[Binary artifact — not reproducible as text. The content is a POSIX (ustar) tar archive containing the Zuul CI output tree `var/home/core/zuul-output/`, with the directory `logs/` and the member `logs/kubelet.log.gz`, a gzip-compressed kubelet log. The compressed payload is binary data with no recoverable textual content.]
+SI]_cDZM7G7{sO$A>sڞ,0w!,uג!؞0}F)lq1𲒿{\=by8\Ҭ{ m#VC=N]]WXI݋v'q gV=;&*tS2/_źTװGҼac{Œ},{:su="iB{ҬBS4e:UiPcݐwcrߘ`2m|X ްWTZ`i9^UZ}O73.AVaț$wWow 6BR,RXÜPT`%:WR+ܰZQ\k">PB ]m>\]_>0h쭫h(qlZW4aۊeUA4IŦUG)HU>tC(E J~>uxʏp{+K+EtMG]ӑ['Dh$zͨC\G2&qp,6 cƵ:r&m[ AuՄﭼlMŝypJpU~ V0 2e.AeaLi8U4ԓ/q]O!KjU9A\y8]F{*%HZDEcxL[f2:m)14":cȗ^H!qof6YCS-CWE\_@d'߅9bG8;"z*DZaadoBgBy87[=[nk`25t,Ske 2-96+^dօj{og^^p[y0AbBf<\I$ot-6MJΥ 4*Մʹ_><Q~gkWpȖjڃ^1l!*WM"7ׯy\}VAU{!"\ԙ1*?!If.2L€h$su&vg]"[vC'~4ǐ.]9[!q/ Cvr7 - bvI`2{Rǡv |<$b)#B4Eyy}y;jj3M0Co篝G BIΉp~Ȋƥ3Di.YDԑp"f@&!SlUBQ_0^F 2PB@oyrI|Z뼷 ,|aJH2C8f"7ȗFcIʶW=K)Jܻ,qZ}MUt*ɥs"e tE UI$ mJfe5}u³e oc@;XB[y|')2sHܺ5+.[\-_gJ',Ղ;Z dSQGW g]v?}MMޖwd_ x G* ԀUzetGᄌ>pJEIﭪv7yFKk{x} fLAGq5X}lcs`^]D“&AFFbXl)[ŖbXl)WŖbSJRȲ-bKR,.bKSb%-bKI)[Ŗbii)ܶDiR"MJI4)&%ҤD!4,-Mzgr>TW9-h=- tRYZoyev= g\y" ዼD/p6[a7KӓbR[tQ%¦aS"lJM)6%¦D(\jM͢*J@`=g֥ک;:QIJ+MK~I)D q_-WfRAbW pR!aN?~}_A7=+,92P\~^񡺦Hf8ߞ#jPȆ)(sr <|hN?f )FՇ:tbp5VpYF4-j۴F ,lӏjJb'qS6V8KSGL[fn6/xNT+kĈ6O5^_LNYf7'Ґd {N۽vKFj0v1i. 7WU## yHga+LfX#F 4lS|8_6zw֛Yj+GedEuZ-jAI|_3K$ G!w}1j,6ƿ٩k:UHl; U蝟"9 ǟ޼}Sǻz2_||58%VM&`~ۦCf] ͆Fl2r `\J\qobՍ2 #a ?T]|=o{6<47?iqj.jkUx&Wq'8mdRvog2}1Y <6 7VlMՏCwh~p%(AEHȘP'Dx- \\+ 0#a^((Immɼa}\ Psyॎ&Ze, 4/N)kJ#8J29EF)ERg -ww} C@bsnֶ t~h^mvGŢA⿴'%5DIګ>T/~Vc~}^GnBC$Sp+0Kvڏŧޠ6ҋWS>m0M,ym%q1bp0$BqUQ$e5;?ߦbө9Nrr]j$YF5N)jߛ8|FӪ}h0n9KA}FBeà7(U.ϧz>~ekTK"]󗷯^HGk""[{md>52.򦹈fovZFF0VjyR˻.YJ-R˻.6d(9B43$0,Q|8ßԁD]jCY\!"I qtPr!Ҍ 68U$E6>BT.d8xKeL3Rĩ຾j:wCdzHh+i<_z VBZN&|sTQ>uk i*JJi< H*< i(E j~x\*7 hf Jh"pR X˒Z3jIȟ=i<,qb34,g:DGg"%LC$b]Fpgg4L_k|CaHg;fY C\9["juQg%}C=h8 ?J)Dac0<<Bθcv/ƣ6ñ nMz9DN4@"{ɞM`dwSⒾ}E |ً*z|nz(5 Pk QG!""7cH<a niW◷/N6*,?l57|$>-c98B$Xw1Є$kA=Dt{k*ֱF A j"ZX h$) ]7!&j,Gj# 2*mqs|fDtQR"jN)CP{ AΕGB/K83">'IJ ȕYXIHG yN1{V[UwtvGwS0M |5rho2-mВdm bHQjszY*--m5Bl9A/+=~>n73"\#{vTuµi~{?OdbG >罥B0F)RYU>ݻ(|8p|BEb(POH a raoby Q{J)!SwͬY+Pa 1bi%75-/ךбfg]Ө^#wU7<(NxxΣ& -XI`s%>O)3Ҿ[?nRyL$Y-Q@TAzLY LYNFxlŅ@iviOv 5+r]'pֈu^C!USWDܓD_R|;QR&bc`QG[6Gw9YR!!X@XL@+'B3 oM|cc(2R c"ACDQejB=WGI-:T\9QΆP0q4;tihi5Xߙ_˞s/o:d{vﯣ>.oz޼$Nm< T%w{lG8 
)Q`غsm{NmӺv7e-g6ͻ]7z^4)ޠ祖a4ot}Ś7n!;:Ά->*V&ٜ4g˟54bjǴ=}|OǕhnżfPսn)Z]|*_w݁N#"W.RH!^R Ew/{UwWvKзwvJ;/۳Zj[-džlF=|JBpH:w> V, IB#csc-yݺсh<$ɩw9$FnsPRQ OL#]׼ts8 }/Z5U:T+zrO&ꑛ{=(T]F$[G%۲ك˗f.:8`Lʯ9:E_)۫zՑs C#z`Z/sn|mت3mnGlGaȲoz9H#Y939ipE$VńJ&g/G^o32wcS>!O7A9?)uQXZQ ;gfNz<lwϙpfϘ=c14C;xό3 3ŭa|c MwNJMI[jm< '8|v2.,r$Lཤ.bH8*g kZ QO㮋{M֕v|sXW&4k)< QxΨ nƼKTdJ&=͆Z.L;uҝͺL:׶kr=4jooP1o%I;>Ok# J ).2 ։P cV9WN ȢjϞY`B̤ͧC^6n*M7TM/:~IS(1CP[fj8e(ur+ U"18om#%SMH7AK%JFc&y of'*c#IIMyE*KK@$u&U&:tw<{ɀ1x4#9jfQ`1[Nr}y`A orZ޼` |`:F]H]]⏳\ udWͱGG{BGE3tt?iBpkcY)NhW)Q V(oGlP"Xu(@FdfCK*޸UŸE2rDzb!+>-m|Ҙf¿^ނ`a^D ̔= >f  }ϼUwވB3VimrΫw- +8P=-|7{o7Kєͬn '4, i8VclQn gI;$ra4 \u[ik+R!h"JIkC Nzn=mЋs9jb[eXdFtt7N$D1%kAEVj*C} hoV 2[æ}\f2tHar~} ˛|˥Eכ(GQ Qy e ^8)%r{!n! eI0S iI,in.WIW(CYWMgz"!H))9P2gQ !:Ph%%QV9K47"K`CA3]uSULļ26DGLYn 1pv< Z:YnW-a4qyx;FJ\YTL'eIl QF srS{; Lخ7k2[م9p17踴fw8'cZa,itװBѐs#ωuEMVr%ŷe[/nB?oz 5zwl%0,4Xa+Ciхm3<;UoryuM띍f#=!-ͮDJnoi ad56@f6Ys&XV1gO^Q v=ޅDʊI qőA 76iH|9 ڛޅVυ2Ƶu=:k|(@:mBJRdh +(ɌŭmSAI U8x=teF$-l̺ 3E&keB5XRS\4i$sJ BO{!s8vYF$J.x4M^|w1N%R&is\z鉌Mu!Bؚ~6)ãLM9&B cEYg.ĕڼ z1QFɅ> [؛vT☖;/594_L0 .:.|σS?Fz`f*wAd30v0p= $2XR$kB@%E vtBXLG~2Igats4"xvF]&yIT #c)ĜDFjBtmβ J2}e?pi~Vofp[!Ҷh۟V ^@+g^ōZѐޚ>iv~}┶tF{|ܻYy11T镟 Ng| N!b2g8'ӵvm/WbqGigXH{$k9sOe.~>zr;lxHգokԮ{u-%M y |z~Kד48σ"6S'zة+:Dv6O/O$/?T~__?+p珴V*ꚂcϞ?7 <o~thzؼ4|ha&Cw=[ V;}>R0.ܖ2cp2l=?tkF<,1p6Y)oOu˳ϰ!)Bt cM{JJ>xGv&߹Ĵƒ"v9Z}d0 LgB,׈W^0yM>}+g|*;,"N֚(읠JީSũsPeĖe[xtvFM/B>~blG)>lJiQ6WՃ1&$6(DkW$W}sQ{2EX |Łwl=Z`w pLEsUθU2ήGtx8EO~] I#PJjQ^Ej++LB³УYI큣'ɷKb4?PoFIK͡|fKDo]*5_lIɧUYRP\JHRY>R&!:Adw;!ɱ]Q+WiwVNpɇo\W -cȂ:DD92#RER9&p-e4R=/\2xIML]L*[AW< 7tWaHu5XZYϢ=&le_m틛C F'[MA|c0),DWcVgkkz*ɜ; |xeg?0>] 0TgT hJʕC3ȵ8hS qdd&`1 $zR,=>YzP@IIHՖզS+Q^da5Vʲpp1roWtqb{$^&݋?mh6&GL [$jP!1,#fE-{{`"KZyL/bSh EVpd1HZ$&Y4vQ鬗|wCըcK}]K֑5DUVxV$’Ģۨ ӥRdZ Y$}7kب_I籞ǖ"1J0m$)BII.km]\)`V&4l6YyuozN9]rګ0EH+֫t֊gIk*aLmX1+}Ҟ4r` QIL̤17A$ߠP~죒>wPs7,I"CDjJ\ 2'sݨy:O y YU+X/ N> :mk›w]Nr&%\Vw]zf'G4 `Z7ԺҺ^=-xE5 l; P6v>΋ zr5=\o~uu=oB9<5OgCd&^QwA۟7ݴFR~T~\0}F^TVsusfeuݜ cuS į<42 NmH:aJhk88*oHYr4޵#Eȧ;tB} ,p>7X3Wlٱ)KvzŮfWziD 
4iD"H+M$ɴe?OTw`()cwtELUåNWE'{41|pCB jt0U 焇nI΄,i4( 3Qx"M.̴`UD +^Хmqi(rFKEdAxFlϬ@[ `FIA 5Q3r#։F(v[m_zpT;]  N*as" I l-m0HX#-Q-KI6=fiXL:d˳QT86BTF AiA*J깨lYӀ)@). |1lnFΎ5/cH|9% X^]\]D|{np_>[ono܅OWcV@cb]ӏ_E ty I:Te )e,x$f_3IL^]d@^J}G^/d,.+d-Q$:'9%^ ZջXzS}EI~PǡFlS>^9YF̝-~$dӱ KlF?#yUU@Ҵ}[ۋ7돆p>WdsR@rE>4Ѓ9zV3=,6z{ȶ>ԭvoT |kM|F$`AͶ}¿yYsaIW[N8\>\{{$sQ0'wL #-o>;mk{t*Afg~"FmgoXWe2ɻ 0cJ/)y/N4 )wBc]hNF/ZZDQlxv g7PO'6Nry}j6Ā8%E ׺N%+1khy+)ރAVc~xʁ7JƞM{VCZ4yH{Ct꤃x 51XFE?]Mduɋ'04Y~93$Lfyr>Yx;?K:]*KVk︮_.@qs]L^TB,U1?M..Ӑ 0PczH5C˖V8G5d_o_ o'OB&^ &?dA} If1 ~U!&bq z?޶:?~ 6-O tD% hAϼFki^xK|KRϿ eq`o4f=PsR`d諔E[2e/&+ YY]RI$ب#:|Q'Db}# .[7s_/nY[}Ӕş.EF? w˿^,fz/~rA<r7BZHcQبllT66*ʞkFe#FecQب쇀>VHD߳kPN~Rfix&u;bY̗6wnCn޼lNϱx(] 8e*b$Ee(1xF0ͪl\'~LjKQŒVrID&&/U `VUStUU)OJuN(MD8ﵬ2H2F J+#:GkK;E"gaNpY_|9عf|ok;o嗋udEo;B]),e# Rj Tk،q3J9.lBXQ<&qS ^|L\?< M_ŀ.L矧ggmI xL![jE1P eA,YUiHũ TMc- Y[xBo9kH95v<Tv3Sk>[`/Tml`PKv5#CE'EuM&&\2 YGsY$vɆ-c]N2*SшD40ߨhBY\a,~XE(UC2dURb'Ъ`%;b~ö'U?J(i'/iOZ [--*H*jœxTwڽǸl@# Ŕ5F#,uaIBK1bl{jso >DtvZll@Riy)PN,<1 x:29f wվz\pY`fL=XݧnʿE)J]w*^E{d2_*mf,ZGw KVuj~SuڮNiye3?n/;<[9Le⡽ N^Xlc/nv Z#g;W?{no~$h4*cBxD CRuI\{Y?mUʓY ڔ*a55{RI6H# &}Cg)%&+DZDS7ZXOm-MEKdb"E/@襴QcZVzuD3rvH`zBdrV5Xz:)ꕙW~Kӗ\s9\Pz{yD(LS~t>8%;dJD,.}bJ̓¸YD0Jn3~K(gQ)K b=`A"4 Ȥ"\?A*=(AYP`$[+ xpdʤE;8"댉*i9;Y/W>1[v/#."c /ޑ'[U =J50zo"D`mƦAOI8gM1|@!IVx/RPĴE{(![ @)j5ZjzJ/G? 
MdP3ɀQhE)&Qk2)SRV"qw8LxbѸp.fZ(OTZ@PXH%:@FDJQbdQ0m.FLf,5b ;q!Dk!cusIVNH5NTuYJ#,Nb X$3d ,60Hik-K$',O[mيԶĎZXTWbZ{ƭ=a۫|G |7g\u6)&rB 9D8i`h„m=dh͂-$.j (ub Ti<ؒY&pݵgz-]6i3;Fa" t}dX;TJ׫n<}MNxE.^4 :~rL0A)ML DLx.qЖ eV1'XtGj@&b[K*|m)uݸwpie֮M,V~fAu|^ "qFYj&8S1ORD%cbY|| ?sZQ^catSIe%3ka(_vpǠWgl;wǙpfǘc1Y3VsP7bZ^mJ#Tj۰'>-i6d\i##')wI͒\op9HE0}W#(/qFǏPd5c,˭[ND#m6Ũybi&i* Y1Ak%TZ#A˰ ǼoQ)zLPewWk&mjNPiAYg"@'}MgѡNr΁PԸ;qx8mq҂o~_ AxNekƭ'+"27Ρ&6WMx0cr L }شƦnZptxA)A'mC)${ϵXi1D>Wz%02jNY{K&?zM:|r@BB!6N& DXA,>rg<p243ꃳ, 'R\NH%6hBB`9/1!+9Ad2~k5q P|`ٕCɡXuW6Ѩ܏F[6$, T&d)A GFUzM*vyZWq@7)xGĈ &:D9*44 M$ *V%?19ߜrapGw %<7Nfhٗtiub}J?dzyD Y 7` A6։0O`f G-ZG]™hep#fRt4'ۗAqLi;Chk'`梼iU{-u`{MQ]mWt,K[/<\񼥪> Fh9Rr8U"aU(};tWUv 2I齒,p Gp=Rqr*%OAAp4v$I$5*Xʡͥ$AIN'F(HѶd[$Cr{ һ<]AHp}uB"9!j "\WODJT1FecBs|BKähD.4 16 f42p,Kv{hܘQ}1f 4 D'&l4 "5+r\I/m"$j0mUc{crTľ"?yYճ mewΠ#wQ ;xFNc8+qI,kȽU9e>x.Nes Bw1&DѫnPTdY+n٧6^DڢL1_ ^{zRLMe! -˨b#Jꡠ='jm+-*餒R)I^;U W=2'{k y{ڍhy"z舕In1aNV:s]81|Ef]^[1Hgv 梋l#~0\05x]1^^U0h)˖W*̡Z NҴ ltd"wWe \ {ˢ|&w ޕ2rw WjZ5I|K>Bq,bbXwk_ZǓR{z".Nq;7o߿??>Ǐ?~1e95j LbȡSHdk~ێ7vkFӵjs __J~ob>Օ1 3[1/޸є[N~\kg`1N.l~^Y'LשRbA=sDaAx&GV*!W哸o)lMi-o@ G=s#(ACH_e50&%KQBiM%HzloC#ʊ}GڞvhU.8DC ,..R UF:J29q!AJrW[t$݂^hiuzvqWuK-/5!T[S&JD9]ۨhAK/qM@ie09Q8?K-ǭƝu5ƭwvoq* Վ h`0-fgW?gwY!ҠUD|i$*P@[<*}aatQ$)Ey+# U{t1)ҪKI2_tHNkH**JF(V yJڮ\9PBK԰˾;q˄籯 ozCu2 ]=A&Aq&ciڡx0jz< ).< )b4My 8?CKj Ey`zq {+/'~]xa1vï]cX/~P ]ŽeV?*/'Aiz  1H[3yws0b^8} QUEۼ,g |55i5|WУ ꓊v ~UU(j_l\hӼ[닟\Zaś_~,2-SNɺ#  '(HHSrlTsM߾z!SF:ZF*R)ޖI`^G* ͺUncM٬:y7FJ*Nh3>¢ZqRU7 74ZQdUXn2qb {L 0A&4~{*?E}>7yr"˄~vۇ'!۳_/Gq+r8v;jZDD/^O&ڶMxtnN)ӘJatP\DϽ帺#4d9# ?)ƾ3Y.}>*M(y-XYtHY{4QE:dz;f T^UB .Q)$w' &+Z]s'&s+HÆ,1}*qlz~,wO ,|=Yz+ZeylϾS kmVFp y =nFR>d,B,X{Z*~-vެ>鳿| ZGjpM$ HՈ(un:݊ze%sIQd ]{ ;y!o'd,c= n~ٳmGݙV~8Kgz)x Gy >]S*&煗|pf$ϖ.ȥߏ9L$ ItMVV8-2Hv8]H4=u)(0?Yѐ $-;ύ&9Ai_1-a"jU}dLj#mTjc;<GS624-j^b e*Z PT6e)JժkX^Tc'[liEZ)/\Czӵ+ߺpAsI1* ٤R YZj5'Q֚PRkPDA*εfDiL QR*S).|VnZ,m1j\.?^RR֖v[!nLVg ) m&5!:hp- wO=PE06v"7cv[\\T5i)KNI8_} Gbw!pUV{ΐHKJ=B2HL8iF#y\^>Bڍړ<:fxyF9Yr6' b|jŇY$EU^܍S5|+DC))IVmu ;$ףI Y`±UG[%cKcuSt-J 
I25mRhіbe_'R|pCziZҬG;${5^uRt0d%pMNԵHMI#cf ꬱЅq!B Qޞ6TeGݎ0H%G@ GȔ#5>L8bPփ|ƂBgs՜P$rDMh k2ayc=yOy |Y%@ VZxQ o!cP$ 8ՀQh%᫈;+QynKj#hڃO7VEcxm[/.N&!z*`YeD=bM=A%xB>hAjx1P _@1`rYwk)b*HcJɡUf}J`@;뒄 Xś"A[vgUqp$k 7VP84jNae+MK̀zAcreR^H,ߟf`J$nÌ g9>X2'ʡSat9 Lڰw([0Zx "i8o!X7gSMhqid" J.YxY 4I!dS(]p$D LBFt6b ];S9 P GV/E͗7;" %&`7 3, )W0_-9=׿v~_Me̔ZMI4~mCw_颞K7bW`~.ۦv[ =='8w8>eb9Ϸ~>,E@ ߹ qѣ ܧn؎kEnG#={v;EޟNGЍ:Sy …ɴ'49 lh+ws7vGgA|U`JRj hLҔ*Z"djP8}&9ލx}[_|q&kwon\/>&vTԧ {L=n^ ͙r;ñu*q 5N$WTiw92s)JAwd%[xA />t%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP%XP[%p(;S99h랞{?_^L?̟~@6Q$>u\dDSM9J[QsѾ}en/>ꇶv?¾%Wl_K2} PZ&f$ xm%dJ`iAQ=gsUlLv-v3xuNk Epu}imė7-VRduNs:sGxvlZ*5Ud,r!B =4mȃUfh:0s<԰/{7_yR˯[TO҅r?Iy{Quzuw}*&ne=~A_{ޥO<vpeo^/, I`囑*$sz+Rd ]dC𾔇.z`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)`)'mR(-5\``o :`p#4\ݿNWt1\fpq:C][cJMW0IUYg Ng?eɻL;g*goQ /~H>-oN껋\oۓg7e{ۆ"?"?NU6=^>iyh,CWS_NoܻH}e}Wl4E(קW7K(]~dZ;D$kW{$B1BLk2W7# 5ns·߆OiOŧpX99@`umZ8P0[O \ŧ'_p\\o=ɁɁryYsՇ,g!/|naqVeJ1ͧ8+TmG8A_~>vY5e{9F_J#AsIc&Yi6ݴӋ9|Iڛ6i'ݴa{ƞj`sիt)L/^(g ?D6&<ۄglm³Mx 6&<ۄglm³Mx 6&<ۄglm³Mx 6&<ۄglm³Mx 6&<ۄglm³Mx 6&<ۄglm³Mxɓ6-6ο&073ֆgyɏ8ٻ6$U8 #!8:M [5EjIʶQDrit=W ˋ6ewŸg{No`$0Q Ѧ3NAn'Ba!{lMؚ4κT_@J e! gp8Fq{g;K*gOW\!Cb_8B j3rCco?٢x݋W4#R^EZ//@6# MOޯWu.HCҞ`nV$JT)%Ao h_)DtA^)FΖ-< SSrFAcY9 +K$:j1@*9 3qLF:,*wvc4a {RFAs Rhבp^LmUq uvymlgn^G["-uY[!T I2p+R_K/UT*R_K/UT*R_K/UT*R_K/UT*R_K/UT*R_K/UT[ Ij!f0^)ɬvZ$(<3/. ȵ/. J. 4.[gBpQ t&NjY  4M/HQH6%gUAn0IځkHQ;mF-ᇶZ_ֵ5d]J_XʔIszֻ|9']zUr};e bNJ\ۿu~$5%btϞ[Ƿ/}ؠDu'gr%Y]+!U)>JRkҰ7qXRzD5Qw۰ȭ. zwpyWJy6o5J\;\<7 \X|0'7>U:b*I6N=q#?\QlL4ݛ #چ~]y^rz6ͪavLKbݗ0gzmWDh \AtIwIw: =fui8j "jppX=醇l!Zu&x<2QzhpIr1=ÿݝ *&g{qO >懷?wO>Pf>7޽4Ο`)QT <Gywn7?o{k,/uk܈mnJ>|/%%w؇Qqԭ3K2{?_~~OfoFxڞӵ֜$<  ST6UJp[b{<,H}@e0c;WL_8›ݴ&vK$ͭoI"m:2ě ^K=ׁe#b0/BSER6,d^ӲϿ1 sPGZ2pctHC}4/N)*AH#Ae"9ԙ#ogvC č@:0ovfćh]=h-r˖"7awjR=Ie SM8m,lfiWG*;hw1z5" zf "5`O;Ӈ{TwfvhO;Ѳ)"L0F)P(|El-*->#`cyh? 
Coȹ?!(G2K8i/5X~nsYCLxjhoi] zft5><&欛@qrfm'm ؙI%#ˍ6fxTcdԑ䈫h{399p;J*wZ=ٙ;RΌ"y8*e[046RR{-p uK̢U,t[,Fv\.uwZf<γW.o뉣&nM8j0UfSsm.j>ʈz~9dzg֓4aofxm?|()6~ S-יszSp|},Z &w;x76$yk|%Z?(c+Wx?;G!9Ed>&9xn2l3QUB`)l՜$ت\8ĩnBKt,#~MA݌AqC7jPRSYThc*46#Þ) bklGɷ^TFOD™29ʕ$%N2IˬȔ3K{P㚦D󂒱9KVʳb;cWYh B[e^e"|%$ד4HYb$Lş h`0E88ڬ4m\3}0"Ʉ ͝"0J<#~xbvɮr*Zd.<^b٤#mJݭ`y|[dygJ>—a<>SDeCnk{z h˭xmQO ccq>ׄYdqtPGo;4:EX*rHdk #㭵C,ΓxKeL$,KĩtrK1r(\zH&X-_9u†]yDǖV_⃺CZg `rJiP[9Ie[Ft'2ii8  r*TTx {,B*CvZo1z9Aܠ;M 4e~Wa5$0dPM)D.2d3N4{~Ka4'$vھ`«W/[7ywa)|4z_F?'EBE ן~xd:A*M/Z`oexOhgA2Nf,'w-.|V'J=CCo&tL߳E^.w avtv e\]?IG_Eo6ʯmڛwd&|O7ԛe^>Qۆ`mxE{ԃػͺ$b<ryܛ[߻zMa{k~O; Ң ?-6Fz!r6:-d)'M"7"xmbu<A/H/%%&.CtR+e< ,M(bḲPv^<;wW~`Nբ ^_f?, 77JOcBWnKN2}`c*cj W*M34}V|mwhϛ WAK(ѧdRauVJ*05Ǎ-5x-vؾAݹTVjkP0:ń' ^:/UHlJ Lb4*B#(5|2ޒ &%y:phLq`q.Ug|=*.5l2K vZ ʝmR4_[L>jocAROrD=f^ /*C Qi's)ERh "KIYj ,AI$v]+r6 |?RՔPKJa* d1yQ(R]M .iU_`_P>,T*H\p,.DpG 'B !'Q,dN_r2FSw{'$خ KZhl{rAh3*ֆ\AS4[0K='}E~eYwZZUhqy==bUi#AQJ=߅ȗ%+= 7S_f:Z3D!9ElQf(k1g6e }njM`'PpF| ?.wii!*#4fdU_H]xgd56z.5XDh 5^(/H4cGǮ-^$$[qqƉ"2[]wSsšr"BL)h/g`y/щe'u/YC774>UtuiScu(uΞoW"OfҼ >z#QμV+>N/?fn0^diPyhnq:$|G™*-q&Phg&A h$qV t^!TTSp^%U)άT!H\&S HУN! 
g-|C}4(ԝP JMsZ ǫ;«|t|(,hU()(Wvy)(^V1h)Xc<錾Gty޴N[`iLȾx+ȁ-n 6khrzܸd{i_)yGt(DKP>%5ֱ `d*ʤX[Ik*-/*#5F9C,(]% dw_y%!0⯅<ka݄.*[򘓔9[ qk_6I 2dU4*/B;a PRV2ZJ6KbiD- `uZ6.(U u+HBA*`662ʒ2bfD!FL3#LP,CѨc1E^{toěJ7wquQ?VF陏'_+S LUgAT<ǿէ+]*,;*Nj:>q71zx$őC-L_0QWRnB @uL Gg蔱t?+OGyDeR K> ςL6 ^FMTMyކAy0O9 Wq1VWZ-trrj!b4=]C.̑dz%ۉl>k5f n̒vn8][X[Ali&U8EK߾yRB|W7bWzd5>h7BW+Y6XM]DKr 4|$BH?C6IF*(e%Ie(k'MJB4ւN_$}koB5o +/yU< (dTJ~a1lP;ILZcɦ4_\mɾ+h'n8nrkv}C+l4i\?U_M3JW-亃 _%L՗~Q2з襶u&MܡQPȾϳ(쳕TсUYaJnVN D9b*LLe첓eh0+ɿeDMN%hVӁ {W }RVdi\`DA9 "iq@9AEGEX"Ink"Bj=J`b $IieFAc@&IvwEUvdo˰we=ahM7F^,]%Dj%G yb'qZ7?07^r;܆QyON{9;.F2g6|D)z,\1?3Oh W=VyvGxe0#gph4VȂ֒P&FYCٺZ*iYgl(g!+^è|z_"ulABAMTeVİR !3% +ޤEMܱy"S@Oe9$O;>V:TCO#>}<4~p~ N)%  @uX*eBFƠ\aF+(A"j:F0*/UbVN;]ed3P̺8ſke1Xr^A`p_glqzނ~F p?UFg| q0|`0!CmTx}̓mū1|rZRnPG @Op"m飶ěU`̙R18qr,̦bamCz [bLԶ]t2yX/nο}qo'* z_gĖQ@e69xlFȂc(p^İi̻xhd OP`  +2PЁiSDN܋ٌvsXP;vl.>PY1#L(R)u õWLA[dw@fIV<\HcllÚԧK]0 "fӏ#AD "D\{ h+6!-EL+#7B,^2Cb:?)$n8XG!q}ctL8cu4(bCR#1s67#b6q6#u1cufӒmqdE\p:7 i TD'D΃%(H=.pX0v ny lvɊyᦞ#[isK6띤H-J:!5bD8:ϐ6Pa%TŢTcFY{yD]ޣ:U߶!k3yHwzũN:tZ (bXi *=xK1o~Β ,ΫRi/<bz:_紮3y4fn_u>IȵKȩnMrWjF u1xw,A JOICwG*|2ެ+FW{_Txs -j+^Oojso<7TM{8 `kNӢmOZ^-'qy??⣸^OkVb 0'dNR~*'i9;?iuPq͍tl= FhSkheyZb$K5(7`Ut;f qapt4H82AU@9C٬Fˆ^FcD2Y+냉K@""&ZH0<)ЙCijZ*jSQm4aeVv{Y=:8_\aD\W4aJ aEIYR+CL 1=ịU!h@QAsglrD`pC;A Ut(wu6q6[R?,hc\u W t t-j^Qa i2?־kX <2Ra$^TB(Pr0AyzWGu%sJ^ h\  .ePs+7p|;}mgL %wɇ;Llg|{z\I6U^׮ți L[Ootk(m Wѱwj0sXVQ ,R| +G'Hk!LS Y,-,xGI4 GrM,|buSaE*UdZPbsjRZW5߯hO0*vyXztC|Z3}/qo5tM;nT֦%;9{ZmlyxV팷1e&jlMc|ktZūu'47Y'!rx'jzn:ٟ*wEʊ7S$Clq9fHY.9JBFOB0nC %Tt(uM*Ҭi;.1E`a|w{4gMҍm֑oi6=VuSH9/SKR1ʞjx1/*,s2J*ί1C=J*lPqx{$~L歒yxJ\Li# i46Pt48Aipja h{g,P0vY4a#"m <<'368-q=h:7=k [:Fݸ:W671Wzv89@ޑyGbo? 
QQJtv0J`D$+hsT:G`pF`8#DyJ{(]l0RKp+Kp 0qb|ib .]-x*|ofDI%2L'\# dH0klQ de4rŦ S76;!4MkjҰ";_dW 6g^3Ŀ-&'a- \B?quC8*q‡=qubu_'%Qu0{ʼnì8K"./jw nps3'%\L"i Ϗ0z SIɽH3o+A#]U6UF~];(eg2>nԭ;LyޚaHe9VE!J`bfU$IZ_~l]S-á([?MO/G^h.y`6]WX1teU$<].J i.e,:uu忈Qh#A= bͫ.ڿskXB QPK MUٿ&cW "uT6 ^~]Aa7:)f/!lojzZ5&Dw/`p1i;߀ po5tywQ~tp[.aznHF1EZ8YT}dvPK~jNyѐwF-U~h0yI}PY)(*Ce[LM[-l;9%r S؝$)N ;$B'Wf8JvpdG+RL27:v?F9W"p33˳H9cJFB*FB(PrS':5=Ij9<Ɍ <Y/u(Q)BR #&+mhUhH!"b1h#2 fg3ݝaK*<~;&ګ@4.nFͮyo:U/ Wnr9<'GU/`E8 Y438("2"P!`B3!Ղz]~yvx;[Ϫotk>͸]z_Zt?o:Tg]j[΂z[bT MVbYy7N C$؊ N!r 0 HGLP@ƌ9$%3T (_AXD:tB,,Yf\ ̊\q(zwV{TjiVHOI;YQ@F VZI)94AEͩ0j^$Q(GNPq&`j{vp?taGY /PEݾhƣ _:x[AYkjiiϊ87L(O$Mi.i*bbČ&aj VP45FN 5{̈.M/0Ia?[1Aӫgy=7kÄwzǒ8(`/W 4[I`_mxRIJƃK}g̤,~z0[u"{.TXzkX*8qc RJѽ +@lNOS3? y.hcM*'cBBiU$4p9F40Bh15c<8]3y*oϮv_ NrYgb Σ>9%b& U;q O{ݮ骼TeӮz׍b3ͤ'κu_!G w<5 jѥ ,dUݮbv]Vꮱw"IPՒ;!IbR> P^8/O^dBsɤH%m2$y,2+m -<&rBtiYɄvh.9C8mhNG~ͥyjl@_(GzD/a<8v%Ťr{bėܾ!2z6snQ{c|Nx 9 셲~P׭c&DK{D.KRYzkJQ Rj4 K0$ <'`IUj<|6$ew$%g!)Yզsl<'DuF2ȒHbJx# 6!I/p(hf*tBqDd^["wG-%"2bsY :q{h|\єm;%Ȯ Kid{0+h%1261):hȌ0& ft2C2vwW[JpE0 n)ªܙhHS܏N㧆8ϛ DH FYW24hCwnj) lj=KCfMb6QffF՟kvze0k*R.]K͗1z :>`o0*6 wN,Bvt׻iV goe~n>4}.Y-gwҀ¿xE`щٕzY9|{Y06ml;6WbʳY#)F5ޠhJh &u2X-Voˬlv7U.8L>ig~ǫXnfBx ܫC!uh)i>{qmVo m]D^Zoc{AQ >s'='K]S:ᜑJSrHYs,ڎ$ W f4)(ղ4xq¶;H!8P:% = X"qɊ'۔gև?әx]H0 a3ZȠ^CdnPfgc-8kOp1bJ()8& KETdibH$%d T65 &+UHK h1,'BןUSj^u+977(젬>Mg?xiHb|G\e6As soz~M~4U.sqf? AD$S{7Gwq);",&{~<Ɠ0<4s$r}d G&hQw+9p 'L=O3ީoH{EޯB'3^|U4 Fd'|*&ػLӫ{KlCuNμoO|wq~<=n&!sbN0GǓp5 Ap4 #OocFGRN ##]:"| M8fbr61'G^Q[=!YqY5_FgbQxo4Y$o䊛≿OnNNbOx~ÇC.ywwdqD3p,E-#$n n w?=thz٢4lhC65z:d[#֙ǹmHʏ.~7LǓ YjS&y^}&1?l-UIQ !V8,Čzd9>h2YLBJ:x܊.lFʝ$f|!M $2Li>'! v¤DPZ뜤om鼦?v1.4DN>yQ*12%ZA/^1LN9:{"֪[ԩSm.\0WSMYyW=O/ ɬBimQJE՜r&HVBEk,5`jsk$Mu42+F֓WV䁏E2xfyKLipL!մaÉ. 
EWT~.N5M ooL2%oLGLSѧC?r-ȍ/UgI _Z.6L G%,s2$QVF!Iur֚kQ`LЁЁb#Y@*:wN/_;11t's~q6O {IIN_/ѵ7#{=%)9g6=?kztZ4$؏/(U(X襲Yh +hV,sEx\m2GݮYnۡR?Nd3'8Xf9C]9f &L C)XՕ5Rj%xLtAnʄu,@jŹE5\1v5r_) \л׊}M1ߺ U<`Ll=o#ޒGVS]NfPgjN"7Jz(u7&35ΔlqU?i63DK}QV MV0͠4(QOyAg< (*,=!qlN{8򐸎C0!uJ3^yZ+j܏3zyХ|b(!HY8: 1-.U;$~kcwAPVC @LWWp UT$?UjЉIpEhJAkD)ܙSw.L;&n%k,h=7,@*/Ah s,&^[#,5IȒ(a% mRc2@H` pnVA縢k8"~DYt.e*.d_.JU֨ry6p5Kv,FQ9$\I2KV 1j2Fx-}q$6jJ&ӋN1qH7ZufKxQڅʟș(b%Xݑ<^Z!Ɖ;e0&FSeh]6' i9+eϬ`T:+ş: !s !9l@q&t%gRIAGXy"TQ?@Oo 3C+FQ%M 9)E!1}V{j󓶋v< t&a yp’Q+!:2Zۘ Td$Y ]tة jBؕ转};o3ֳ߲#=,F#9LI)e|G0F,S|j+sڲ٭6ǽ- ۤkY+y%X_'JY2]i 3Ŧ0f\Y7yq4/x֨PWuj"F8'Yh =Ȃ׭\Ş2 (ɵeͤl"hW>fp܆; S7z B^x-}ތ3OUz]֌i7ꗒ?rEC3¡`}}J%U':?4~a~ͧ*lJIsɹh(3DTJϼC?jx}W}ڕ1trih#7NJ 6LD#K |ʤfF$B.f=q-s6#YW |pZg^+.ʸ.6\\QreTF@N?)jJ謜 ^!"1'!"S>pXZ8<ܱ l)U "}ȭJ6ako띅H[DŵpC ~EHW\hk%FpRj;ɭ{!1 :fd"a#G?la?hNMeu۔$ r 2 j*+u%L R\7so6"߂2pu_{Y_fqKg/]oȵM`^_ҫo[r/v̠!tu%uu7k8G ]\w/H-;jp+6|񹳲ŕ\&-ήq{" 7\8"ݣn_ɝ^So5hm/Kmm^rŵiYa=xD&V*XelbMU66rXegMU6&V*XelbMU6&V*XemMU6&V*Xel&ob-_U6&V*Xelbo96XelbMU6WL}ԖG0V&V٪4ʗ(|4e;ٲ*(U`J#'rPJr Ϙf'[E>'dSlM}O6ɦ>P1ئ>'dSl'dSl dSlM}'RMk_#B{kaM쭉5&ޚ[{kboM쭉5& mc8#~>Ǯ nTY$*d2E*;g4o{ݛ~3mIT e@E#PFЬ#R ӥ#sɽZKaF,S0hiΌ;QV$f#.\e߽9[6qsLݽJ+MίFZް(><+ՐtA㢷L ʷ | 4!㬟Ht?wN>\~ Ӂ2}S#t/,mE`:w[뗦hw{_fOC܇,`gd4ϟ~̠0h r4xd0g9^Ytβ{oO ) +,{e2v&sIuGuIdl ݖ~e/{t^9} *'0M[~_knA_Cp%Agt.|4rVөhOOo`b| yYEK߾K4'Oô[]DdגLB.2|Խ99a1:o21"EY V#74V| D(}yލ04`80 Y1R*%FƤ \²sbIhAqA0gg]ƺn>32 bt$6"厹H#W48תlpg3%3JFQ >Y3Wwz5%>ζ˭)A-6ǘͿ`MV[Ϙq c'4)"ͫN+1<@(Z+1`b4)k>s&"e#'hK ]0Ã(j%(9d^Jڨ6oW۠_1XScv[Q82ɨ"E=sEFVK5@ yǗ\̷N@f|4  H',bhm0^ lծ;!p[e>lYz^jXr>Aލ)ڐ{u=M(٤y"{*~j ˋQ_Mgd^!lwd)F__/On1UrE6:'YD'ȭ>e160t9;KD0ʹ'Vӧ76޾fŽ ?%jrvWh2h+Kp^kh$34U,UC>67vr3UIyc+M}(ǩ>-@Kv-^ʲ^ RB0\NJPO;V%#o^Oz풕S֋ !$%51* ^Iu&TNu0iϙxXWyĒH!dR\`AlutxULkE@z x(%ٯg.IpX}OZE:C9$^8J2:qrl@술Ȋ{/Gɟ (e%kgh7$;̀Wr*6raɑ*X$*#:(#uQ0"O _b`5Wٵ*K_& f/-{ 4/۬-`b6~%rx%-=M"+-,o;gooġM4X嶶~h0c'mo0 WS/~nV#y2,c%:vTaZ jgw!,0\Mg}ߝ5ÕPY) 4BkN"ș6蜈IY1 nAk.YFRah0I:"S P`XVa.h]˜̈́/ /47tpsyY`ki/H/a:U ݋۩O=kʩ奷q ̘ΈA xɅrN&Ƃ)%rM P0%9D! 
5)%#rE2阸rQƱڒٲvc F񌔜rQQC\YRU^s#F"wz(.vf=*!*c}`6c`$Ͻ6!)ΎYEf;q;ctӛQhcw@J\٨@'eI(AD&ȹe h3N1vDclBY:VXeI`C*U,-2M൷px#za vg)|騧x@44_uBܕ=1p)MuxN%ҷyJ_.5M~_eh֪{1Ld_VD&/xq,ԮL4#μ rهSW?{772_V$&,n}AiPH.\uo?ӉE璉7Urϒ-6{o^Lv^Oiiؙi7qD̪ !uJB<\ sY$RIGnG/.mE(m 1[UvS`t):}+YxB'B( aPKA Yf- +ֆ̽7m@yټGC7W֤M"b!ogһXzBu|L><$&.K|z1̀^Wg*TZE;)uey*-g 8r鎍g3we2y^b4[J8"MdPhؒ[twU%EF*%#Sfәy.TDD`5JNqHQ\gV3gAh./~[;s<{cMB]W\jNFUkPU'0uAb_=_oa6 i5\iG I,o ]c0.8~<%&O>ё_pL=}Onea|zz9I3N?Ͽy=?Tw2 ,+241X*㬟33~xӜс~YD H궳6:C^>tvChw{_fO<<=-`gh:?:W^NѠ,' e7ԒN<_NkmH˞H]}o~quRmH| }QGDJc#riNtUu_)a7s Xd-m)Q. lkU '*CKD۩vv0]wciq"afŤ6 xb3;2rH*Yo,jF:q7YtIP%A-y+47y.W`m)V ) f\\!+((Kb GT+z3weSDj+Y$*:kJJzbB0$EZxˏ1~!~&v{nff3i &k%&5X魅씶+4>+;윒³mg2^{;x cpeSBQy`ƕ^^,EdZ:\$wtOFzHBؚSDš^ ,;BD9"!L1QŬ3GpqGND?F٫ 'ٟyDK)i[F d]e1IiNzw應'epّ @/\ifL~P$~RX,)Ŏ80]x'<G) #j~m%5KU9>2#s:(QB%;[?3i!@X#0<کp .|-Dk{mɕ#oQ AX`O(rqۏqZ壍%gt| Fg|]{lxЯ+^N[f!i$l]튑7 ih-w7LLRi8q6pT'Z  =ޟp6ݳ.i֦guN+.xLhm4֗rc8h_o:+(pSM{^_v ӟᅵ)p{ޝ-8b,DM$8h8#5y[S|L|o1/6a[A^-uȭH.Y EQ]WesMgr/-BdBI/oQSTirJ>OQAT\reʘ; B.Rj3X v,0EH%` `U!CWDZiu⊨Ԛw5+J\HjJ][OzVd$ {L(qD{t64j|wTPǝH~B *㮾R[J4z&~R4M6 Z?r Er!NtsvvJ0p64S~ [mrOVVFyzRXQV1  (ZP":z0RTRTكWWW S^J\goU\\# rURuU+]V.|]AYf{SsZ-1un߇?]N>^N>x-|eX,3Zku_N`@DÞz!JmC7? uHRmĜ J+筭d h}TG=Ŝ=&llу~=yϐLhuJj'+E^_́(u& *F88s$٢( Bz\+X'Xm]y5|=u$8S}繵dmIx8vQdEf$ibgK9^ie8¬ ZY UWpCNuy*aVKz #t6I&eC /n 02 S3KJjiK̈́D‫"VPS<{ >5r6# i6ЅFo ݮ NjbOڀOm`-'~}4q7k<8Og6Ih!r?btՎA$AG=7+!N"D"<Ɲ`3OqnD"@Șa}槌L3,Dg]teD C X/; ai%˦/Bltwe6j5ibVXE~^ <-h80{5~L ,"[KmAoA(~KoLfC2!DX6,Ƽ12v2n( h )cIvb51~Gz[B#4Ǻ`1W;>yrؿ<}߱0ڌg;&a~@A4l8pcv pgF!h>p$a pGmˀ 71i;߱DMfN[0h8ַi BD×ŗYxmr:#_ xD?)r?z0"\"䱔0É~5y~͖{-8(^}Y؟{hܧ&e^Fm Wz$DZ#q26ȭrkI2zcXMpzyHyz]'Zu 7dߊ>74nQpe'PJ.O2+aU76+S hee*)e5gx2bf󟞽iܿAUCڸJU$.IO,dzNx"HstXruLXv`S"O' ALrB)A[OVyLڮc9j\Nl.F~yf(\(GzD/a($T)!priymlYQd~MhO2I>˽p҈]\AЮZtBYҗ xD%! 
5)uQo B( raHAN%2FΖ;!H))9dΒ>+h.Bt^%,ypD*G ,m܈:M|eB6!EŎ+cC}0x g1@EUi>wΕI1E̒(AD&1=*(a8E'q=1bW^4jtl m+Ơ<})`g}Rd)aTN[r KZ:*!DJ㊔\o)r)MW1M?F6tB?k7[)^C(F?׮u&䅽 aֹHvۛ/3]Ae|`T΀zޔƕ#+ kQw۴ȵ<^14}6c mwp#'~ZM.,FR덟q˾%[6{oʬZ>l;LHspr7Wڞe+/_[rF'Uy** ]sFܬѷS7+{Lś.8L>I,9y* 0+&5'O "ז~f^fRI%E%7Yt.\h}YҸ1\oX{I>H ERj!%)e2n@ ʢ-m8TG; M˻BHS#7FRrS7 I X'Ut<zbB0$EZ<qǙmq̺g*^MKLk1[ )mW. zi|֍9%gϲ^{x cpeSBQ\`ƕZ^XKR$CR&p\$1RBؚSDš^q%ҡٻ6r$4m>/a. ng0`6UI<߷zXN"Yhewj,Y*Y Da9$aRJ A*O2'Lϟlg}j\ZKVYvwcO$??@j׺TP/U +ǹ귿(|?+:9@`f^ N]!}_MGT!SQB9aG ٦nx}L/g5IpV%ol$F:[&5yĔ,6u89Qo ԻV楫Yb_[Wݯ'o_MZ%f_ h8; sa? Fi88;-77c$FOWI rpqYגZtfV'[TGp(U|<^/isp1nZk[edsNku]_ 5f&5y .t$7,py<&VTe8G6 >ۗjRsYeq8pto^o?ۏ9Syu G`}ފZ'!{~=鿼iۦeRMc{m%{=-ڥd״0 7C\r[]էGA3[Bmw~ͪ  ףPWPlTYASyT\* +˃-h.o4KY $< ;qr5>&mGIDZ4BF@ pJ,& d9JQHo$=ayu>tѿ.|*=5Dh'0|qRJ\@hQ/2NMN]sQlNxe>[;O#vːۼ_z%\<%H9aAz:qM&(Asub9P%Z3+""}/Ӎ(G2ŸEX^^gްqx`qkҞR=aLyJFH ήagZUC"6 LmD0O;SuF1ݡt5A%JjI"R[Ih%cp|NXQ5)zC2΅HVL%eRLpOŠ>1ϔT7n"t} Pk Z/yCNzHSeq"n=`\ئh>(Sg +,-&Q;eeEPhos'IyfT4b3w< Y7H#JFX. m= L_EyfTM *TG#u` `A牖4xt bl%.er 2Ǖ|xs\O)7q]ȁ<2=4U F_TU5msZ$竏W%c%:OyteҪ i5$7?da/_x|bv F.ۼJ8x1\%0ޫ?Qh\M~2ˏ?',]'%HE+$%˭T&ԥg*|K%-3{q2/8ݥ&"h[g_yq}ٮ> /Ŧ&>ٌuņ9yQ_5d`h`O1~gz553=W?X)^͎9WٝTaσuD*jr20ut_kWsmM;lƧQ!A ԜR63НJK7{69 2ƶ`Vx4 O^@;20%VHx`S *5U6F)Dob !o3 RHF%S69k5I,llgs!Kv8_Gh&l}>xuYB 8 yCq^(1`Vp9W. D HG4 eD1.@)E/Ĺ[QY6 Zq"(6OvY05h:nלCheu99m*G,_> ߾;PT$ D!GR. @$EzǨ6zpV4EFU !9G8g 8l :%*I% ȯ|id,&iqbXXlf슅0  M23^4z0ߕ>\7~:l@mUa-!rl! ksF cCf]$LXCDH^Bl"PylwqaK%Om-L#9%.IXJۍGa j;jD=HX" "^"q2/x40nHo&N<3`(UNS}@blq 1KXD\ R\{ J=ou(םvP^~(~S0THcu$)|wR*\3V)zAE{$̫8'ɪ7p6^sinfXrqbąϘHLJY!\̲|ȿFJI0%H Y,W3iʼnU%`騔!\zϒ@t1*-k'ᦉ/ڥCHo% RxFL4_#x`J{&_;@ X_n}G֧ϣQF+)ö~zD* F𓉝'ǃimMm8.qRp$ 𺠄odʹ5Xj4{YMtu ם<. 
wvTp$ ­x!'GQ ʂ ٧4U{ʛDASֳ|B]:㕤ku3rSzΖ58w;\1t'Z,-"SG>RE(7% L:ܔY\f%7evRJ|2)喯^4Go#շ+MNMJ1W]_=C+`* P*K+E*KdW+Xsu0pP*K{߳aR*3+ެ+ս;.t>u::;^Z?ݬfc5O^}WMYр9ėsZYspBPI7|{~ mUr%(.p(02yV=L?G@oXt0lqj(f\Wp5gi, bɰ}: =7ΞG p1 ЙןwFQ–A)Nss%Pvc!X滪y?Q, 4O'%cV!{eIK-5rfL"5` cl ^B Fp8;J+TmwRozR|Hи\\ \Ji3JjE)9 -]eiM]n(%#=\=K23?>)B٭M UBWv{/w4E afS ?Q\x0v?Qh?~sHtONeq9;BiYJ{zpeP+L0x~W<6j޷~Gog4_T9`|Yc`.p'n7$7{FmY؇9ٝ=ؗ ^6SlI$,˴ݲ;A[&UůuٝкSJ%ʽѫ\7L-7*r\y=znD~7fu0r R=hBA A9$ @Q,J.E.0gl&c{G^ASu26\sұU3PfS~HGOMA\ "b .h{f+\+޵7m҂>}wVFJ^'hs⒓mP' ַڂ4lk(S*ƔsA0Ɗ^c&mNXCw4[ֶ#I]u ܩsL?4٬/Kgf(Jo[[.$!GjE@Wg" :)911 g*qf3rǙK< t[uys|Ң0 UtNGP6F+3՗ % EcAkQU`ir A]E ASB: 8`>v<n' hJj$i-sYg##d)`w`t3-LLJ;@RDOU12UJ$瘭m7#~}ݥ{#­R2Ios%q.wyrv?{=|tOO>y-F ^ q9&  uTPqrlIB(ԳEP4]Md3K*=2ǐ'Aڕja}m`*S\/ժ$NZLXlTZ6ҲDP`A tZ^ qǞ*-B9ktP ⥅u0+2!Jb8O_ 6Ke$Uר#qkziAJ[J_0L*H9:!҈wT9Vօnz &ƚA2!E6X%#LPv[ ENyѽk*+ECZW!;{TR #~Go͏ֺTX.wUVGuw/aM~?5 С?p/&ӼLeC5C!PSB%8XLJa>8<<>V;62W#$@›mlFMT>b=$}ZV~D>%ǫg)/3z8JW"kxz3@PGa.r5K,YtvF\>:t $ڏִ`q{u1򏽫+?~8M`=1 9Ofe:el~7D}{f;ٟQu33{uȵ(iNsY>f`}u?M/Wz~=xraqjQYtO~w?8T#בţIp ໩m-[MVڴ\o1/K^3w#uoNՖ R~YdN'ǧ5?N׻ Qy+ Wi9DO_!^8̯z|@|}D-c]iC27 :645f)hOF&YBROI6t^׋C=C[nw}>>Ќ'n9nq#<ٝ͹C͢kj_z%F&f*R1ښHUݟLsZD1Vz4*y%/'5FD,Ѕ$c$Gcc(u1Nke˼BK%FL+ K#/ >Otɧ`J$%fDqacZDMKJdb"E/JB`ƈL)>[sYu|'#I7?M$[MARH]-1G}(uAUJyRWU6 xvjj%W'RIkEh.d ”"xUeY)PM,J/X! J/Y2)ɢ Wu5@UY3r29670*_ʱ\wl>xG,a#I^KFPXf-jl6Q͋z =O4pFS'>abRlZ]iΧCbo>eQ$#M,NZe)&Ǜ6e[AX0QZJ8zW(A7WrD[E)OTI-"a+SN gF Jl1hvF׎_|(׌Ev>XViu[q!D+wBɲ)߲*l환pvqƺtrz9h!;J)//g8=t[^C\$j* a{8<])`)GEHNtD, >sJtX:LwQG%k>F4vDx%-26j}h+ȠQR:AmqdRg Ee^0k*yl+ȶRk94jtwUu#\/5dݭCil|l7ZmÏP^OVyGm.LYәe&o:0ر+;Qk4ID#{ SpWe§n J(*TKָlduZ撌 riF6d/(R0X Z36#~4Ӆ8cW]8XsWݨ ԅWյ_Td&dq??n~9O o}fmKRGɤȪQ[IH<$šUp ^d}vw ?j,Q*Yզ26UWHm-(gm aTDiD+rdoC͸cGĨGE ++Ж"k RI$NO>:+ -ʦhctԌ يUcYdb$I G8DHlT +s>ZlR<&Cшc qF|+Q+lB(4D%ay)E=rT@G Q`Lmmz_lF }RL19$C"RdX,`=Z#6#~WY/λp ٌKvՋ^Q/zqc3aTLAlP6,Z=ۄ"eFNB()*AQ/>^>lCч;zvUh{M9{V5j >4^yO"x"n~|G`bq_0E%W}Q5~/*N/` #k '֎)>k- #,u밤EEcTʖvp}>܀|aDe{'O9;4=avZllȤ)PN̠1*uT2P=ݏYM6{]x۹H$}Wwݶy:W7?d2_*$]Oww|wzw{8g{f? 
x_+[|pvraͅ<׼ ovK<3plvϺ%7Vkuz{wtyN/oIyUr]/JDo%i@4Np"]fZ(G T1~;6RV);2O jt鍓}jv>+>] I LTBidTA 8pdm+Ѻ<;]-fHmZ?eK5SC}b%3os|LMA(*:G,=Tϗ+8翅a@ckCQ5‰&&րMnޛ&}ݛL*alm \əXPr(9w^'Tm ɣlzj(t))5-);kAUri\jQr98b5z4?Ύ{gˤA_GygyۋWo{BnS:/YcOIR8 .!x6sS5F={#ƄՍUȊB&lQ(al{KXTÆ݃oY ۓmSc/ֿ東vn).6sd<% v|dyxmj f~@< l>B9#]MB0MƈwXt4Ľ#VO<;9:mXKzlOџqU)ոfHvҔMzjf'n^LJ ?O?."rmYb~jG$5jpa5oN yLj%q*@=Lف aݖf< YG@1`HrMdjFBb-BHMON0h I83܎3+tLߡ]܆F;(GiN^,""!D`bi=UV)))HhJC.Y)7.v1srvгzfym 9.Z -C 8xfx^_ 5&XC2%Sv hND 5ʖCqa^ua}v -೻ġm.R4:;\>A|iw^tz`ɛ W`.11QbƆ 02F `-PUlt }n3ORզ.ޥ|SLJOs y|{`Gӕܙ<񑇴TYIlx@;itu1=+ܿK3Y*upwX2FG F_ޗ_^=R40/U6&k"qJ~H F.L`3FJ*euf8'׾}Uy_sހ x"k}& vc908On{N)z>C30Ϡ7n:{ӰemܷA5 nu pWx] zaBM?".ZӬ].*WJ81lzK),W7NcKa{?_~???kO4. ;}?}7+>Dmĩ࿿^=˖))[4Tw{nxN'W C}xK(M΋>E_ _zFK/uRM3I}ɽI}uR%ƹu: &|[v)" L1h=(yMLX0OB usU0jĐ( ؼƼ!Y|c.^Jlj.Y.\c%^rbW$CblMپATkWT)*.3%a! m]0VO\X9~wGAqtڮ߈l%gZ |4vTc}jWIaRx'\53yybʑj~n`fFaFcSn1[ #2ěk*m)fzmJy{Y[Y.0Iͅg:#TYpͷ?yf6w?O>o2c.VJ\ٻɢS )v}1Bh3zLq@C-h`]d?;Lﵨ.bdNrᐫL!;&3g"_%ƶZB٦QbC `+e ālִ*hҺ.!EW1fΎV5j aeGӪ/Yٴ:[ rK]" r.1C`Wr :;#=|per--/ΧPkքLzbh'4&(:Rf/95zXl`ɷcp&rp0-J.CCJ:tDH͡AF <*x֨lb]34[)Iʓ94xdpv$wB>SHiNopO*ZUjBBVI,ڥTsE=Yc%Dႜ64*z>]-X1ı22:vp;;[dR*-U̇y ~ٗavrߊn\}# unt[LmbQ[ŨWZTL{v/YPT޾f+QD' *HHX)M.-lY `5Kl0f6DJ,6bWTl-dl2 g}a\eb! BU'/>vfABym,_0壣G*UR6femLQb\d[Rg>Aѷ{:㙝w6}wpTilg-H%eN=l8OZ/=:̨=3اCdB[=9c Qr$Rǀdb!@Jx,e jڠQt5Ut@NQ1J?\[hP-`<6xWǞ z3"Έi[‹ѰfLMbj-$-Y$X p.elLgL-Wu$I`)-,2I3;6f>uvk ΤՆhԤ>}jB5&:慍XmFwu S lK˦(9rjig~$@9Kx@I6'2YL8V^TUԿj4GܸGܸiFBR+k+FSZB\,%b2it v)WB%'H՛bŌU +SHu0 18 d-p zCO4l8;!)Ql _|\ӋxYlwIШCRF1bS.2h$Iey֜gӲZZ'؜@岫9)d-:ȘeλDƪ!Wl92>Z``J,m Ύ~7[5FP𾨝U@A65B!U`]6MfwO@N?A) LrTB -0Dwq5D 3F>o-R1;tc05.cPө]hͤTWWǹ\|<ݗa6qϔ^3XZx;e6DZ'6dR4Ƙ9[?"!|:&-d6nEDL˧7!L7X?JxH ^f Xzf/\XY@[ũg{HWiݽ~ EWhwFYt5'X sn@ D1\uqNNyD-玞= 0WƲr알ѷxzQՙQ)" O/?Xv $ 76y&c9Ίך)v,4#0-hO2:/aN(EKf-D>pޕu/قP&4c|(19:hT^"EP4tc9+ 'EJqTQʷ.:2,"1rbrX++$w4\li)A;tg VNwV0'/.m6д 7{LG0%W$\,f@ATP(U2xi23Ń(%lQJ\Lhᑂf! 
(- GXD8I^$f8CzH Z{1[,} *;[.r6[{+SẸm<1nJ:$ڸ9БBI}Z^NlsЈQB)e0R`D$/h{pԃGZpOI)aQ{ H#g+ҞZ(FJuVp%T8 y Ҁ؈ڃ8 \2NXYhRGD @&NS~W#mL.5lXV7*|Ԙ"YQ"D5dL@VFK#IXTabz¼-z{% MۛZi[.nOଠ-g go4#*q'Gq}|y̾ kb%C 4Di5C$ck~A=b,`%S0g8pp2†&%lY=H`aqXC-+#RVI 1OO;yʂtc&PfTetۨQ1p'%Ff^^>.Ż_}"ǯ)PzEY>fSn'<3z>ϔA3F/e$G/JD(Prz7G## N'Q֓nWKzAP " 5ЉJa- I& F^yP#- l1ڻ6ky5a MҎ_ok{G'p΂&=: Jh⽶2(qJ"Wl,DK/'@/`:b@`̘CH"L1@" "Di#7b1` D)ɻ-bFOSu3vSa <ɋ< tVRJN@ T{VP"I! ʑ*f dauXYWEn?r6u/P[B4 2^!֧@&4^;xC,JgqD"dt ctCfe<(VI4Bcx61~~/w" Y qur zG@ۏy,8 n6u汲'*q:9+NU͖nǓp'ӳ0Ѥo_%,R+7@N7V56Oů _~R-_u݋ޤ)ƟǚHXc#ꖈ9N.쁛]ydiOnK<X-%|m`JXW@|h9c'UNiک%Yl= vasa\g,diun`2ܦ5ͨ ۴:.X/!]"I@]vի_o>(ۧm;,HU5 gّ)S^j4AVZJwPcj঻kbjCevӷ&1InR/t [k!JΥT9oFǨAM܄D"1D_f3>LwGYS[,RXÜPT`%:aR+0JQ\k"Ȅ^ezix\Ulk)=~r~Lpe v (%3 SA`s'rz){nDd}no07\ nލ;plc茷Xa_XT%O2q~G@r<{T9)Yw$i֒pS7+.sgFe53DZk>-;Яb0Y(jZ)rHR|HMLc1>%@'^qEmZ won]SͿƧɗqH?w^-IbeWËy{C)h4ltߚyېli]Xv{3!-8xM._3]:m)UHfK+/#69/fFNYc\]p c>$~_x|͘#P56,+#U(hZL4iZ+:OqJǴC^&bE {'D)AEKG 4]k+Xp?ZA~`&OCўg DЁb7.pS ا"Y v(` 0ESU̎0enSd! H ˥CF'P+;OGiŧcb|R NUqnj!RZ fgG% S)ʁ۪49vZp4|}0=Qu&&?+}jKMtGu6o>/]Nmb@6'kb8'R}=KDx2bVQeu$[b|}Kg]͐f(,/>'|lb?]]9<mm[wծjJ@Hja2u*X$14}]Rk_MeqpXqt>|>?;}8z#Ly;q#0 .]EuѣIqGp鿽iۦaIyTm9{bbP-ܚ(Tt7 |}.wMЈ p%q=h'6yêpm5U?ܤJZ()]T 9ҍ  gMߩ2߸qR"1+9JGPWDxeBgvH~vyȌ4`C`㠑WƲ&e݃,غdJ,Ji@2*2NȊ#1$=j}ĻA>GQN;$CBE7S !OGwodBPCIui6&#mWmMvy4r/O"e:K_,!S|E 0ώK&,PTk" " Hy̌4{Ho3,'gӪ3_oJ;r=@]:a UoH]쪺0Gg*;s8:^W A Ў:azTwe~WG1PG1jQl|jHJ&[$s(CDYJ[ IkrbJ`${$.iDF9VR)Px¨Da(en̜k=%yI׵;}CmؽR[L{?\<^ίG-)&mH1lN1G矗l3k3ѣd!Xod ߻9<bDp cTpk#Z| oYʿT%ɔSY$1();SdPw!L! 2qj>.yf3s+\Ԗ5@Y1_9/g_|w^OT|g¾Fϼ$GUE姷>o#h~8?:^v2lOR8b6ׄ,~dp9^U[ӓ|kq^׈]?o ˱2ʟ膷y NLH-~ʟp,ܞSnxfwGggIzYWDݧ}?y%CP B"`3e*G b2dAQ-%a.ԒDN fsЇBN W+u'ϑ #AWs`4'#Jg:\=C2+S\Us=T+?ypevWӱ3K9V fg~Ò71466 {O? 9yK (5Jf D֪*@_?w;x_8JZyXv:$lo?OWx[}_/D^ym@W/[~瞷&甈-.`) IT PJ9rֹxQrz;kR]+:M3?>x/{3bb=jh] u5кZjhAsWCjh}w5кZWCjh] u5кZWCjh] u5кZWCjhO[-kp5X Ws`8jj%|g/ \u.xco]7p Je]T7v.xco⍝b.yfo]7v.x4*I+,4z~j'gʤdUa4DMN%  |n̩[&tZ&tk\csK# 4x픍dsq RX")mW. 
9& )eU' )'9έ̹QV RjRQɷ]^)U١`q|uM9S%Qڈӣ [PXrDQh3&x4 ^)t"KAbڭmH>8^L m5v ÐA#Voy*ʺO|BMB8EKƩў {s 26ӳSf#dKw쿑>H|EU8:@02vy ]YF"Q3N =dO&æki^ցu !9ـEG9jP[cHw2$4DOmuI}nqOAo&VI>S -fõS鴰lW,l0w'Qu7WGgu-%8>_dCtfPDz/aDDu^ Urm_DAڇXQ(UJT|k 1BI)Oە_KR{-Un=;O|XgE .C>'TEl$KN{Ԕİ9J S'%S9P].1h*xζږ![R*!HeȱωJos2h/%g~[#c3sGv\6badm:ns;P-U$ˍxq_0OGۖ%e͔-fm!36&j>d;Y ;v֤=1yQ^*l?|j:UMml4*V(l-\g9-~ʜ;+!TPc!G2i+ 5^(yRwd L҂jJ(j2֤(&Q`l}Šf܏cP_GT"nl#sADYp*2h9Ȕ>!̠#G@@mczv>!3+uPۗDRd#cfX{۽Oq+sGOfՁqq1kJkFɶqA SZ8p*27qa@m :=DŽ"&fNBh%՘$ڎSfc*xes 9[:|^Q?r {xicIp=qoُ /56-%e>TM 1{9Sc5vU6kB.oF?_?G; 500l~qy>[Q|z@N1~ +[鍰ydhtCB,%Y4+{+.YَDbJ&h7F(FGFEHre&ġ_:+)7_%ǍcP{5s,-1\yi7v|>t=;Zd3'^O{xua݇y߷Ϋ9x{ȗ;Uyxqֻy\&c~Ph]ʜ dĿl h\3FFbV ?nිV[&}z߮!߷wrhO1cGnʫ>-P.u Qkd0ur~ɑ3tVzB$#'L $ e*_⚒#ihSH2)ᘱ)']b8Ph2/IH9Eܭ{53~rĘ8Ob- N'uʆlH._9B/ȓ¦bW2s#5@!b]Phy`kwm+DٌA-Wwg!c8X(~7NJ>s#.x|knD824OSG8C)4gṋ5-d\I =D;ZIє(*c*j1LM#(fgrђBʾdi)RgzX_!,2V߻d`wlbTˊiRl+AVI5"E5-lNz8$dᒀ9{Fn-bHoN G85ϗ/vyd e܅71[K>g51`)z@H% l("dG,s0T[0YrP!t :1@ )@IIA W&3UIu1fq7`2W_ccqC@ab,_ȲҢlnY϶o}0; C;hk2EXc'iEюNH89;?3|:q[~A{9%hgO̐Z]\."DH.`Rf32n>+oIن?U[͸-˷E  WZJa1_wY;J-I̒QLwtjprIEC! 39DHx `x:9KiL Pr!Iډ`  v~ h19a 9N|zGN7І;Ow :j-$ xba"(h5fVxUHh+LH$dJLr$jIgW$kmQ4<4|iA _=焮I ZT_G!HƉ0 9Thp" 2c͸c}df -xv_8T0'֖EMtD6K-xeϧ zV߸&)l`-RT1+J2 4˶_u?0ŧ}ZQ(Q}lÛ#y! P`sܿz>j~yY{Tɿ7uvӋt3XzRwZEbVwXP}SW<^ ᰆ8M*E yzb$?pU9[۹oWSgR<Zhn}FۧO+E՛:}g|[YZU3;{/XH-BWA_,5V;tq5~/- =XЇnee=eIz1DY_k.>lQY3L!}=>{[ wm}8D^'>P(k(m ~(3KCpJjnqy]\-@G̒қ#6߶BwG7;{n69#):G=xLޏy3V\mujN͞5z7SUC\x+?g(մϳOl66]l{^n=ݪkscI!G: 2wt D;质<-ofiS1JB2`C0%yX3*(e"Y`c'c J1i^ο!U8Fw pSSu8_P}^'Bgo 6_xʷWFT!,.(jһy{t4kG_{ǥvSYTzՊ/zʜLY)Wwȹ[ꇕ'o֎5KT%b)=ġwycWgi|Wfn܄p/Ӻ~#o؟;NVɛ^_67ؼesu-WdtW4),&<ԭw䆵-?d?ꝲh]E8FH ~NWnjxם:v*!n2ƏSb'yJOF!4۵+tϨ~]Ǯ zmcx7O":n~CO%ͯW)&yy &W<",3'KA}Keկf#>ҍmt·m|a3pДthrk-ksvoFΞ%+MtZ\/:2@7uW![Hs} wmrQDʝg3쎩ZxW|\{D>JڪP %G JEo2Zn5a>~)OLڝv{g3"3Cm6>~(LUTfH @d):Y=e %+-SdxO %/sq 6yl#S"",3 *&_".VI o`D J^iPH$&/T1 (KI F@"윮5fg=_`MѳG6jp"Ywk|E,U 1G4R(B[ea>QcUݳoer? 
a0$xn)xgI>-Rb[L ]CUUv lkO6:?;y-o8{}-Vl}^Vmww;xw׫wkco* [`,Wz}@4.~J \Yl|g}_j7a \AW4 \ફe;Jq'\!"2 [5 Xǣ(;:qmq}nuM(n =_|]]FPr.\?+}fqs3碠7QgdIdٞ>c1~94{tfLwv*9a1la_w%]]M!y +[\J۫prgK/5 ߗ_a|rY Zx{|QcOP?}7ZꢟX)m,nX vsa lV7w:u|/?t/ }p>c8_@(K%Ҳc o޴Y'WKF+O|Qj\qJ.7 x4k0ZvҠ a\A4j_t(qUqW˯}O*OǎW^l W^;= JWP_ Be#UgUU\~*tWzǡgʊ^\kWt~*.E{Jpܡ'p0rYֺ㪫|t儫vL3e/ڌ㮺\GsUWO:F\絘p_Ԡn\u<{wUZ}1{5IY(Z?B̐+*KQvo73Zj1;jdey^qjg3t^#s>@L^3}w\YGɫbvo7~$L3^t(ZR_* Ljifhh1;^țӵ(>e_k!ؿkYjKһS=\ooJ<`Xk(ZW]'\#,#fQʒUn\Av}1ꃊ㮟@l Ҿb͛../{A@hj:6.MPWbA7hۀZo߫zd~_{Jo>1t j@-^^^xe}߈|}Wmi}ۻvCgM[]5m Gi՟.V1_2wÞ{;V;|4SunLƛ]rbݚE 0Rϯg`|"n?(^ʶo3#`^$U~c~-p b##k$3D H*N]]]CU6eP )A$@U1/2'T-8TIOd4/f񬊋2jVb[jWB&1K"eRuAgIQH!Q!c%͕R5T&CLxeX`zateXH/'S&- pPB}HYh &s۝BArͨZQ+|[4qȠIx" M_\ۨbB\B_1k <_#9TrvND($&"vvL̗wfٗ5_ā_j+hƖYĴ@<4@-$ >:Fj:ppiʱætd髒ћhKh2<"Y7-V/>X qa-<%EXiI"eiyu,C VLtaZ,;a9{*AC%aY=Ec<ܭɽx3,nwqt/}N,:wP' ;Uϳ5 V1DU@VT%gF]V3#Hb ҧc~zo./ GUEq>e->N}}6Fx4=\LA@^AJ,. 7 G0Bd451Eg"#VP6bX:mP|⟬"h$K\*J#F\5,\c~{A?"wQ^0 f {YQc4* ^YL z΂u ZsOA6[t.q zPcs'UgɘISG 5A\AJK`Z{H_!|H&l _Xk,)l{%YHf=էzt' e)3I:!A‰`!peR?]s4*Nac͌eyT?(%6Ցnu`@"m@x%Ph#Q@8 ,xBgvW: C@!Jyƞ zr889 8BK tx…HjiT;Yע.56Cl0:zf9xyADC ࠷$ktp^;0]%ᝈFi`oQt0pazߌ:׏dzy*`΂U |U QxY!nD:[ss 5gr> 5߾};=Zz97]y]o|~op3$6ͫtT-TFx uw.fYFv&C\rZri.i6JA}^*5&pm!9NޕM]G^xW d櫜S1޿lؒ@ܖ {f@0!P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@:X%&JRe[%֜FR@X+%ո'A*D @B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P JUqAmQJ .%wځ {H*KQ J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@:`%b,% -V٥Fn΋Q)`Dy%@2JJ T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%(nkH>ޏriu^'!8r4BvuI%c.h #}bK;/\ K!\PDWXQ ]eTBWsog=]eJ!] ]Y +lD9tbde]+@i)zW?]G68!|t\IKWUj?tJ-EWbHWOmzF ʀ(2\K+@+:]eWΈ*Vp,#]$]p2[__\Lg3R} Slh;`0*&zD%JwMƳgq({D} HpvGR M@B1`2i7u_礟rE͓n=#0T:r&pL :O L耺F kh<UuY *(#Z_eaz2ZgşH9%5ףy՛~\\i?KKY$6-'5X[|}33!Օ$VNT9Vi~!ۄ;Ӳ` CռQ*#uV'q;լv;ݢV+$U7[ymUF+lF59E+?rXWBۚTs_ gUe]@c`NLjJtAQJlM1QJ#)f2)}u(Prf0J9(E嵲JTQ,/9R ]\&,Ԝ!] 
]Mޖr3*S ]H*ute!DWr WR*5tQZtute9%`0/L˙p_T$]5+SˮUF1+ȦDŽ f79 6Ci;9ЕDzjH-PܔBWV)uZk+lX]5S׺R*}k{(E:@B2N +-ǻp%/2ZyrC+Kdk.F&GY4 J^--4%Shא*ʻj6t_oK0YkUnZ}U*I̫:*ÜL6oQ}Z`of9__mun) ?1e"c*et94';ElIFoY q4m5X}z{nUM\Fm fg_&nz鷺Z*Ln"?| ׷m,gyY/m NVkK!2>ڊY&I,#צGlazx}6/!Oi1|RیVu[GFi ʼ-%p5+$ ]u2JÐ +, /:BWwQj<D+p(mtI `(CH1UF; f3IfҖ UAy)ƻz_AoCWM_`/fUtOf(ytTg)]`]U+ЮUFٵ%1^]Xg#|;/nW0[ ]ZIY*l-:tu8tō⦤`M*.BT ]"]Y*[SAQs~L(az!;5Pj@Ud#?bZv(J.#[]6jer,`ZPp .5gsgR`p&FDW2g3U ]ZN;?Qrtut DWXDWpQ ]Z/ &(0iIޕ!rܽ/^ZNl*lek!]]Yb/ɻb*eŤmft!Е~dc9S{ o .sfh67Ci;6w7+tԦT " xZR ]ZܾtQ C+ƨ" ʀ-+xW e殐^8\ WR }MoC+KUK[=Q̱̋Ori̽[0#;?F)LRh 9dHHB(Z;GNziE'djF[1iV{~]^E/\#ǓTxtmշ6pp|?NOOf͑Alq CΘJ"+R4,_U=/y]?g:L0(6F=Y7Wef^=86n|s)ox3pW]x~x+^ V|7J>㯞"6zY?N׻e| ~ V%ɠONb2E cyrAXA,>qg<Y243ꃳnsp}tUKw}ud.+qL*h|ߦgIO;Ǎ4ᣱ&B Qy "FE+=11獿 U=me~1#2 iudtn0T5g23߼v~jshc=;Q_&vuU9 ] ݨ B\lfxο.'U 7H^ၷ^e[(l6uQ оՂ@ˮri.]U EUoꝁ_JMPxf++MM!kĻb|䳲"b}FwdE{]{G7Σ~Q! x9si[2s~#d䵢NGfpFH=D:H S"I㳿DIڸ;u5W!FSM"yaiΤ3%s5'S)R NP؎y9N6NnNTGz]W6U,g!/;X隱J*mEH,ii2jM-cED`vֻ5L ߗ'_8i{OAw6^D/;V;4#׎s5s(ADyk A+4|Vg4G`6GÁ)lyݏ'.> e~yo;/}<5l=</7m"W)H>x0oB,7ʲ׉y(BrLC_DFxFȶ0c$$=t% P*D D\DW$NWPpj5[JcFN2F>zd'--lr^Mgwc.+~\"1DpZl[460HFz$pFO#8l) sxGN#E"٠=CyǁPU%hBHyְ~X? } 8+㐙m@݊)j[bǼ%~ "xP}|d_#1-.>yd.S$Nx8wA,"QaLKR䦽eg|s}c;Dgi,u.s0}UmY+fT B=,\i%D XVc#o$JrP/`괘_q箥Q"QRi Mĥh -•8v+;nyrջ@>У9X!C:k(1*d~2ϖxA>\MIY!,+`79TI3W}q72Ŗ{f][.]w&\ AJ#b6S^F~~si/_wq{3jÿ#)TPRGoHvØ4O~ ҬO9w')#sx U r=1ѱ[e>1v24 !Wnhzm? -a yg#fe\3y]nhMX qӮrfE=hCrU߳u8|s|0eD~ qPYάJZH8)Y 0bNkAoӊ}_~F}~~F-Zptwem0xNekmNsɸ{3(3%&^+D/qL~m䭝59YA"u-bo(PEpiNqEPy! )$RHbEyh,C+uelFK1B(>"*@'Da-ȿ C7H!֫:fT*=ɰ'.w Vnr{X=X^NO.NP٨]] NE8$<yQ֑k"K><(68m)?{~^#Ȗ w3*& <UWʼQ4 DQg1My87ʌ#Ő}5}.&n+7(GR!}ug}\nX7ޗx,kCȽ㮝s\JD;%L}pnO]xP w6bŷbnƼj8{=|9n SZPeCŸ+=ژH뛼<z۫y\oT4Ǣ7wc`|e~bzTE ~ߴ޼>d8rjt(2n!xl r|o ڂq#Y.񦡂Dr; Rtq/k:,'"rBDrDn AK73)wߠIΪ.K.į:wq7׳dc/[!;J~/5>nT%t7trt=9&]NE=JESjx12fׁAjH {zݳzm]ru~ b i&<7t4 k38ayH ?`^|Bsv/ /Ro㬌luM6V-O8Gx|Bz: l#Z jo_jA`9tn.Bwx/HN97͋_^7oN)3'?9}8ĐM0M Xro~\v7557Ԫͷ-R+7{ }˥nX\q3_wǵ<LeR /nM̿`#_TrrײWPUYP Hab`? 
/ po-0mN"|unV2}:( rASVcBIOZ,+a 0/T9HZ0'ye=Gڞ'(T.8U faqJ)4o$7N+6JPڙcYTDk eq2|酧:[upyu-~b&JD9 {w Z}XvJ(1ɈɃ+}Ez- zv3{b W遥z$ۙU B ʮ^fӿ.:8GKW)M–N @ḿazo{X%2-yxϝZ`(P(C"Y*B˽^:<)ҪKIn]i-HxEHɊ1!15ϴ҄kDvRrBSi{VȺ e»S=B`ak`BXLub>)|b |{( 3a0*ֵ}gl ϪX\&k'?ċ^"s(\)W_ Zڹ~PTqxQAGȪ>$Uw8 >GSu6cA~*m7a:ܷb`?E^,J"l[*wʫhÌ-߫SU~^Qzά۟O0ߨyk1;?o'(N^*꒒:js}:?,ʪxtu8|fsKKX@Z8ڶ K'J-MکH@\trc"&!ĕ~Ҽ }Z u,Y30K!{G /WءrsXjaA+%D(|_od,Ʀ҇dKK7qˆ}*#fd0"FN&VAsRQLD$rEuxrfi O. >28m6|,V#}72QeuwJ^ZnÍHl8 Y"Mg֩$1;< -J[TEB$;g%IجC1ӥi tL(DL@ID$Δt qSQ~8' mi[g;/YDK+2gX)SW:mTnv9BbQQ\Y*u7 >jSu<FM -ed*% :K7`I{+!#*(#cO GeS;3vofz$5,6&b$],Q $26$cvdlsh,iXSdVZ4w1JIJ6OUu,O1Xڮw4W{Lr2BjBLb砃\LqbJW1e1*JA>[H޲ǧf'r'ZU?`&:Hk\^H~/>z噱0w9'M'[)xU_OIK&J4&Xz#;*қKoۊ!R Ҋ )@NKk"Vs]ɒD0l"x,{zWȿ9YRJHb. BRN?'"$-Oaik)9^Ж!;DY-[n2)tS}qg-sЀDE͞3hђ@ZSrO kpe71rwH/oqZTkXdչ`L K@e"FImRgAԠҊ7Ǩ#EeP*_8)j<_})sCgqY=.'=>neE=;O3HL. VS5و9"Hk늬GY ʤKe_J>t/OnzkJ(҂ʘ C DU,h sJtأ6yK_?[;[dƃW&EMhKYT*lPd-r+ˀR` iTFc)5UXaTAo4gM):gNd-)JW eVDHW źȾ|w^; Cڐ2j}/}F{cqD*ĩJޞ Xnx|6ƦtVFb=d!ŲEGumMe7K%%/ڐlr` ں'+C[]>7ߚnND.W "NND2G{iL^H] ?6Z?X$%iȡvzae3d2=1PIyyd=Tٞd YJئ͟baL|T}lvᇩw SGQ.x]?Y' lY#wFH'L*ʿ>x͓j r5yl&@B;)A)&Qh }YgO/|` D5BU5Y dނ^J-w%PKLRI6|ZQ@.lσ>y]ɫ5.=MkYχO뛿ofp>Mw묯Ejź&-@2E'DFUjj\Qh(4Kܨ`#K&c*XR^R d. ]RCE CWJ(Śdb)cN!FE*rFWY&Z_0\LΎyEvKr`e+QmGWkN"Z1~YI0SE@)`ٌpJzM6T@A6{r9u19;ĊZ[d.:V;c=#j`f)~xy %֯^(..x;bko罸Ryc9bLYhΔ EYXȮbPa$(STҚPR4g;ږ\cdU!U$rŨr( lAJ쫴40Ucb=c_.[`,v--n>\l<}WS8?k<>3]MZ]rdH!-! P&PkT&O^W^JVMm] D ,,*KF,%abˋbBzǞbP{XCˬW"{G~N"`{*z`LGFZTFh2@^tkr!A(,(F,9Ņρ-&a"V VXhD<"]2FF-CEeok5!G,z=OЉG@@L47.jg[J:<[t*p:8 I}sLD\L .#qq5:%rQ-E98x_Q-P%9Ali mAH Ed9 (+7\lVxg.'@- ~_]9rZo7u@=u&bC܈~Ż7'?l섮wAoZxTaFɏßN.o.4$dw" kp \OnL>;_7T*\xD&PtOl IE|w\_ǘ_=Z'n|]\I椟Kz? 
Q7-J3ƀB RWZHjuĐktz#}u~7Bno 6{M9:y ֕)$b@GM0 OQt&Stha:NNr%;1Ԝ{ԛu@˼}ʇ޽]>L֋־I4n־Ym[urlwfv+auf)\b/bڵ+VYnhpe(7 D\{%u\JiW9#z 6\\ XU:B\y/@pł; "FtvjUqrm%kl<@(B!Lb\\ᎯOwq5K>7OpEjUJ?puRh{ Hn+Vk]b^ \!4RtMYdo>u^JS1s핲a~0riVuLJ-bp:2c@yѥ~.J(O{hk|328UUZ )y) eg ;ג`#YY^ZVk*q̵G8Z}G)v+YpfU|rZq[W,^pjmbnqz+ FW\AVi1 %W,nprgjU|1bW/ũdWզu\J3֮^Ԏ^ |Y CG͒Z橕< \}_82Gu+\ZZnqPJ: V˕\Z-Z=pu-: VOnoބڨw8Ğ?]+׷|#+A79t}MC^~ݟxu;ޝ%?Iz#Eeܝ֜@mϻo{ѫG޲PW.?uR q}uV^5wzF.km݌C@{DpM{/&(%U'n}gFHR< 3yH;!YcE pqWTKapWhUzB7%/ϳoK0yXSec[Lk+e~5~J\5O%qzaҊjIJrχ=#ܟp⩣@W>qa'?HJzlWpu׮ AN@`IUS$-WIJ[zpE0* UW \%iABd*KUks+I V \%q>JjrpRjs+E n @\NL?JRv}9•&/vq2lhF㤖 L._]\<buo:$CP9.{Dp=er>ODH9eA\j]}D?[dP hbRʽ'F:qP\14g{Y`}{58{Of~ F(qdߴ ޭ3t ٯ^T׳ 31mwcPpAyz, |\MwݥLR~B# ,0$:$V8@JZ,N6Ϊ(YrdM>ܔ 7Nr`t 2'.$Vkq^w9+٧o oɴ?b!۠LnARLR^2ENjm/\]8ֽt"URyЀoROqY/C3MI 7\5ir5^[&6fP_wj Rj>~Mz!Vlykozָ<'7ŅJ\LhXc! (- GPZja hٽgܤH 8>\_ 0f(Ig|i1T)b֧yojRz\Tkn0>8S* Ϳ:sgV=j cF43r~ŃK4J˓*vo2R/I p?{k^_r@l 2 3;ę^/^CP,r @z&7syRދQ72ˢm.' y?hhhġK)#znǛzo'ȑe@p^Ԭ JaQS`a[GwL)#y 1?[iv)kף? 9Bi6 [JL.0"~ iɑ(hˌcG(=٨9Hė0R%GXH>X{%nHŜ(aH1qv#D1gA SG{u sp.N|؜fL~7i^4^I%2L'~ "f 6JFTabZmx+84Um bדl7 YbK^~7OHTjʿ2ƈ"+g9><הywS:/di l:4Aܩ`3,s긤0 #TY~1F#aR/PSBXYL^q30+C@-t1qvǂ8q98oGH/u%j=ziJ3 jU^( XJN@yE(|z`1VY@1ZleF͝QFGR*:dmkygIܰxwUi u8x^T^TyEוo~wEC-tb/ѯS,ȳԤ~9ySຟ¤f*E1QC~d1|.#?x@IԏTGG<*qA-}N.i+kd'% z0zW%kXљY㦆(<=6ho6t4*C:Z)"F^yp#-jAHGkmAYnژ{9M8kt]Å"8 ~,.M{me  Tu0D[DxDH\q -0<0 !#sIIc>@Q\AXD:tB!h-GΆ8z L]6մ[[b{6Um^8A'' < |VRJNLGQs= Z)$ J3 c0:O33M.} =E yRd ǁ8MLARcnbq4NXx^%IGkItݸ^E#[wZ(dH*\G ʁ3NZ&̀PmezkuZl}G!]BDv0B6ėdUY|^?NZ+Jţp+WqwNˋO_'w^:i\:>$MC7ao:I<"96(,ʘx.4k &myTN®~ٿ/(t~~d$m:S^ p6-,tp-책"~ONt jCI_Rʆ?lxUԷA篴"I ~N9Eq^S("_ Ou[3J\0 ![E3-v-8C)W;e pܹf*hu}YIfY^UfcDtĻ<=̙$ڛ\! F̡;,.. 
>Og3.ɯ8s-.=*8Y-Jp3XxR8"Vt )kR @ALN)R|.4~w߼>k|^:Vvi q++ J$(|Hy y+y/̞ś,иzIMU4µ~ 9X>ߐtCV,rHV/B?F43{[;[D@qQw2T؛s;8?Guou[S-߲H|0ɒ:tY~1fܳوE6R~O#Ff|N/ $1#PO:vqˉH)f89A$\z4ߝXT- 3rx"Tk=wD/t=wHs9t[7#%,v4Pjzܑocb(RMԸ"9=s@U`Lmz |H;s1ޑ>EV>>R/48h!JΥT9oFǨ'h916L+Έ/ΨŶ<n@t> !)R )aN(qPέR+`ІRQZk>txO:Aޘn,M'[nj!(9ҫA^xXu;W|Rl\6rWyЬof[+7rQNܻs&CBa/wش[D[x)e+[NrɷݶbFڲʅʵ+w"{+-~]z쵲2a=<8,%g|nb 3ꄷHr|1{80 82 Kg# ͅaP') %fO|Sa7e&$OsSaR*y\7endM{w+]O';/*MhKҔ|^;hf?Jܛʦ:lxͮ$ yy͔ B@0ʍL/O:Cjv-&C bșq=JVUv}JpRY%Do c]b[u+-XxYY\*zj]ƚQF<6(`$ )7}f`ǩ8wӚץw4ZyU!J-}Uqj|JF4L A?{Gnl| .~ȇ]ÍaEҚX)Q~OH#Vr˖ִa˞Gw5Y:d*jcA裃"0V#6hߦ7u}z|d|H$Y'mG^%cNօM$W5~O'Su.7GM&Fid :(BhTUF@bfS@yCs ~XA̤ƂBgs$2d"̫ k2a8qcGMxHYrɬ kItq2p/2Od*P?qgE5#m+d!2e Dw <}Xw+mEG. ORIXVXkp`DPHc ޣ.O9g^Ls@ K/sh#T˺^6K2 JN%lS5YSc[ O%@@@|@cm!I1MH=[N78- ΃BwP"<)Y c~Xdי4H̔d QL ` ZA 78 8\-#GPaRBf/JY6 NP,ڐZ?c%ϐNڲhʘ& h `fxfAmJM)IKoETĽ"nZF7)ݭE(a M-9K/ RmryX~=hZ-LZ,6:f&j ԭG7@7۬ID3 =`&M1NEEmE&R֚BQ9O)%. `L Lyvް[N 3bK*x F9-)l>D t#bvzA,GZ wM9422tQ3Ec|RnC0ֲߤYɰh"*O[WTb,M%r;XI'y'xyA ap`R 8Fj,xHyadzwuSh|Q0YQ1(jS{[NkŒ#5^Xf=GHQZ *%i6D D D/ Zx:-秝 4)MV7[E0%V$-6B ZxX@TFX6Z4`䋖@+u!1f`JnÌ G9>X*IEsmM|iĪ0]"_ 1 0&DIF iY/>\]|}Ce\kٸŽtu,a&B*"I*$%$AZ˲&ImRݭ)&ݚ̜/'S?cDlյlɂX:9?::pnMoqSfWK# ]7f)>+^I0w.t4UJѮ:|:OQ7 d쩽1#ּňpTj.I :0"xH{,ٚ )2wG޶s=.+=ǘ/4T+-6LrOmÈXK4URf>>;*ÏX{xAGd*P2C[ c2 }@|J@08# ڇ1@Vمa^G9,%2‚tқ9L]_KTS,\"x=ZIc*mR|3ﭾ{ÑߺK iG\OAMxORa;vMVh_K^ϭ]c~a$'N 1Ph.%c`]K0v .%c`]K0v .%c`]K0v .%c`]K0v .%c`]K0v .%c`]ە`DT)\`N5u\ڠ}͵akŚk\hF1vq4 vZU5G)LEPqNf߃xae7+umVUEHR.A_j- :Q%@]]r:m4`_FʺO^tˍ+s$ORKMB[,,U RePU6+muh#V{؝ )Y9G={uٔ/g7ӳy%~ڼw룋_frkbَ6mU^-O?pmcww2;ެMzЂ܊O/zuuzwUb]/rcetWbz9Fu#ϟOΈ/}6q(d#?h;.l.Xnޞ;W?/{ëJzw8C +ۘ Q| ?}3 y㾷–n]ZϭݔO}{W)y %USdvA۔; _~y@x? 
3hwh8|Pt]$\",~bW 'R+J q\^DH_z2͇8Nq[?皎T6 2 EzC9 ٨ٖFGkM_әdMgU租龣:0Oܘg{2\tOI=sVv~g\= ~_T] v.|o] v.|o] v.|o] v.|o] v.|o] v.|a_'Wļ앲Hh/_۔^BZ3_ 2c`9ϛbL o90"oWʼ&},ܩzoW,'l ޣFfr{fAyB }{N4x6[+s)o-yTz@UmK?͟j9!:*nlq;RBCw?,M|^Tyw|Uwr8zw$\QnVEnuqQ#Uѡ4qTګf#bTD<3`:a?VFDD49,Uc)9ӼƢ4eM:{g=젦_Y~dq ,IP>\5t)ymQ»ZjٻF$+v1V/ vc<,0 #xERZ,M%b+ X-JȪQqNS SH j}O&ۊ"<-<ғpq6TH\rworP7y15sד3ô^ȏGˈ_H*MUi=M4{~Oi=M4{~Oi=M4{~Oi=M4{~Oi=M4{~Oi=M4z{ K!;p)w2dsV lVJ毑l O6*]#,]U`+J( ,LM>{Mcq.nhؑRqdueJ:yx-U[WhR5룄eG|72kԼ*2Ϋq/7W+x ~9i?Mꧮ?7i=]u# 枿12uF1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s1s0CV~`t~P;fQ{9/q9!=\9=թZNG+kDWH&Y}ㄬ ~\K?H }20*0𓱜^&jH"+$c4H㵓v(lL>n@ @]UkkMm9K_D7^F̘vO|LnEѥ8a)q"C Vɋ NmXQ7|NQMs_\asqPi*TRDH3 g2<8)#QXag2*:NgyMJ;<\y%cȟ lz{3X&i|[nGtp/.ONށO;lp!b6H 1, C}RT3[{de Y IN,Z+ *G IG;X!Y #ƌ/a@Nw4ǯ2ſ;E"RF`9! 6yUhM0A).&e]6\l s)8쫪;h~xW=ɟq.$KO_2/GK]Fao+m)ϴ ˥`mJ@Jm0#DfdD JJo Z[gie&h<Ӵ"TcZW`591j 6IHw^ jbZ^C*/$)k%Z[_B_CTExZE6[ Oo+y!V)KʑAWLQ%Kd& :TppԂs1{wfKF[vމuZ]F;[A/;aj$Ud-Q$GPt%e\ `dM %>ׇ@!&&0t=s"2uQ>QM;DOy2y~5!'5.~=ųea24^hm3nؽЮbn߰ՙ-UJפgP)AL9%z*:nJTAo_oP@5)PoazmF%ܗ|CD]TS^F Vi rB | : ]uOukM5GAV7>&`JDI~.6i[fYqOOX &PZ֭lbb鮚9Kw_G~{ud3z͗f3'v,hډV `6/ JyB`0dd\OEVzh4MVʊswSVԮn/brK[&3P̖RRJ"D2Ip dd8XnMbhX!C=Ȗy^EVJY׸`6n)^[SR*!PM֦"He#nlRq'-EzfZxƔbX͜ݑV ͌CcaC_)Xg,6EdCz[6vξr'y9t9ͿmJ@m)@l|Ȕ+Xف&-2+QWg.ȉRi62-.#.%VCj!u}UF%5V;Tm]MpYsI6Gd¬DXHٓMdks|m KSaJj9MVeZSrƞivB[JG-9,ޏq!0z +98ff7q#Vȗ2qS)A|Hp KG`N1jOb8lu 0mI>_M')16!': @a=}~ȟY52;7o8y!ἛK-a=wQ28!kjP*1%?//Zlߒ^^ ɻ/$q3χtVGKݏy9W+\^|DžY'A0im[7jSU|Y8<<9 yWY"aW+j n5'#݀抓n@k5tZitknN5_o0GAr*Lߎޭ.4/Fav8UoxK/G7w֣E amU썔Yn-49=0 q*aUja-L0-vk;_vqv}4pJNQ h1`t{Q؟OƏN:.p1sr<>aKCs E'=5ݗQ-^6!}~zh|)0e0H<}DpΏه5PX9wOt#_oA-+!i|*@%EIOV:ٞS>w8cQCR̅@Kk瑆@B?qj)w bF3~BY ,d)h2=KA+ DR^O9UP55'\v2 n TW N(\tԀZ5pV6t*Õջ Wh;deBJBk7@+m+'5 ˓ Wh&O WheCWhc-\pZr9=;673p3$`_u}?=~_o}+?eJkFnlRǗItz=OoNm D{Qa.]‘#)*Ψ2qku^"g`dqmJT)+L )Lez_ed$JKƿ0^e) qT'AK%F0 AOمvLNjhJj'@̈IYxL9hӁ%཰HeHQ)!Fr.璦6e,V'3)sl)H-sv :]he(X+3+-;W{mg^-iӧB[\e(Npnуr͟ Q.J RĪ w٩QG/ VIHL]^U,1GbLR=koGe/9~0^9M|A\Ka9e%!EJFeOPLQ9RleG*sAydcS"kp 
r%0!s$#-3G98IIտh!j|)'ߑlÂ8__qc4'DL)>JdNF8 )ū``&]F8: *LL oͽ}!M=RhhdK6<ŮH9kNy"F1i ];;ƈ+%u4I}-H8{)yD=R r?HbA]=kߚT/ l(^ T09u\RЄǑ ,&U0bXSpgH>HJCDD b FрG!e`e5pvL: ~'>Rݝw>|9r5s%Ze;j>&'z< `RifqXC-+#Q VI 1OOX4* !h@QAsglrD`p']5pcZ> S`غzlK)z,Wte%QSqj~ܣS/Z##s ȑċH aʑjHysI '0Lag? JRc HX#$V+bE0rsy)}ǒsfF|:oY>0(%WKX}mRrT){<9\x* IjGŠx IJA8% +BEStc0<a01ABG@0f!$&| 8PxaZ" ٹh1v`=|-T'Nw(3>Iݭﭤb{fR-zq:yG,bJJ Qe4eDC@#'(U80DpGx!*K_?Vx2U%Q.\y.Z Q!DrҘ^]Z~PE3=Iݾ_9bO_Ny`xo*;M0KlxG`s0ɭ&WH#Qs(SڬZ`ERaJq[45͕$gN\spD&x_],O4zS$u'jSTeT(]V\8^y|_AujFdC8ߕk^%`.jߺoU)Z{Rd!V-T4 b5Mpy7;e!T+{H/|L23WpA!:\AZ",K`79}q-kbK"pBn-W .f0]\_]ָ8ƌ"GJ=w\/o]zN4+.n}vU,mC:}]=iؑg 'BGA6Dɹ* 獕5DmB#IiE^엟%_QKƶ`zx8 !)R )aN(qPFOBdXC -Tt(ʼn`@: K'œ=_~[ykgcEAvV/H݈j:;7(PҫEpi3&(@.zS|5j"k?z-j^0~kAb' ^Q),î-{c"[ʅajCx1oy'zl~/]ÓI6αZʄ2aA}gdnDh:vF6zJ̏󞟠W 0%eT̮^v"0 !~k~yay^ּHc3Rc}BrBRHQP  PgTeozU=!V5iw1;ncqH%>*Ֆ*bfcEAh~ۮe (WKM)kZFe)A@Zqy2ΎMÔ%מ!< %[ TPu@`tHjff_: &uv~bGdTaXK"2UUG$m o"XV/(WȋˑplphJ+kFGB:"8`R&%ߢ}k_uxJ ʠNd@BXGTQo!%aGًL*> }EEKM^HoL~c M gAL7s}z^JDڹ~P.L 168 % 8 ˿Jmً~I|ZJ 6`(aDxV)Kzⴥب܈@dImvx伐B+5+ ,_W!rpvD(z*H&R0#;Kw{5 caް-+=\:癈5d Ske -96<}|o{յ}@#ЁHi쮈dNF04N+f $Pxm2OIJ;Yw9G845,xBV"fBIõiHO%6Ш5g '9v/El00nY-td@[Kk U enP! .F7"HwsxMC8ҷ(j6Xʢ ab H@5VqE$q} .6hj&L|:Źq~}.,I8LN}>[EQ''{T/ ~}vCj=΍D(pelpfgV?`$Od+B0E/F1u(™^S ga108UiC(XK@Z.\/nw\ G'e0*Sc,kؤVN>ܔDlQn_u%{ӳcTT \N25q@-6 gpO&ǫ| GʜYɦQ檱W'H0Ǒ22!,>ٸR1fnT,O378? 
G~m|9~cL߿y7?`%L7]&m ͇2hs0.FmNaC0ZRŕ["W#zPTtWţؾ5A"$++6Yy2(2Wwq#yy U|;% Q( 7%[ys_fle$$ BKOĬZ*&zB\=" )1[0'yy=zmjr< T:ʨ&a9D !F`\q"FSH9l:t*fkg4 i5(>/#jҋUsyŹO ]3l &c0ΣYGy(򹖜N)bxJcx+eJ_st 鼡;oە_sA?7\KD,Z GB IʮabOܟ5Ց#ٿBȔ21;/;1O6K Pe` qT<:)2o#aQQEz)h:)n*"f@iLmô#z&Ex?-(5%4#ʓ't5FB8>:ZjaR.%aԢ{oХ7hsXB ,b|&SfIB+Dٔ|vmtIap_zV7zBOג˹RKXF3\27{HבF{ W֌6g6j3GLئE4T)DK(:֨tי;I'tPOBg1A`E4|hy]vb%Qr%V] ՙ=-#0R2դb0s6T79%eM>B􌶇6XgG) /Od#]՟?go];O}^tP:LǞT%\T]Z&`+1!!$H΀|8U~WsVo_[y~wk 6)tS0>)>hUV #U E9ydzjF*ȘԞZ .-Ro?/vy\M-4qh{ ȥ[Zgې@SZ Clgigˏl*X rHATۊF.hE=&[PL6\O5o(U䱦9sj< 4fTm`mͣ+ƾiϏ4'~F|d#Ɋ= 2)^s$#bT,}KZVWǤ ?HY`[Uq-V[ T=:ڙ0Z#skѻRd:1dB&ƭ^Ц+ "5>l|{CQ1XmR~ZM_'C#,afK&ޞOgWbLoSa#aXPw6euL.ސܢַ[~R?k!.ظWhNeIu͏&$G&Y}(C[QFUMA֬&V͢^K?H]k~vi~b3&0C'+Mu $2BZ kq?@՟IvI/Tkc[l>cPwՍS6tض+ 1/Rp&.K4;<=m*8S?>wp]PrרxۋҲbkLH9C !3ve3 M#ҝW+2v~{tqvi&4L4ݫ.'}\+Ib S}r$mSlfx]O!C)匱 *qB Ee2T[Qv5g%C* CV͒&\k qjlӥphƵbbK]dL왚NʞRx|X1 gTI)yzVN|ld\Y|->%\9xX.jxS5r5<_Tj I$XH%)6appV"KE?N5a!vpt}dthʳo'S^ݞuAr2:rhCYys%Xhg9`y]*VdI$ dXsJ2g3vmShtD9RjBilPkC-xAhf~fUqᰑX.5 +Do_2qdsqz@*v? *~t5_CJ+:l) ՈjKDTȦAe1"xɎ\y ًjuӦ!{H+,ķ\(sѻlSaa㑬ڋ}I>d BkVc:2 &ґ+'""Ғʇ }E/:kJ@Z^p Ұ,Nu qapõSYĪTq#>ˆoU-n444&Pȷ$͓TpY#A!>>n>W%٥2A՛*:b0wޞ̈쑀cWW¸9l<`^4 /.x׈`VfFZ qjb|C0Os夵^Q‹Sac*|P@aWwd:s^9j& >cc0uApKD?Zmsrx@&z2IAM%W*h6S%900GɄ9>0ǒQ,xYbѹ:qS2&n$"A &0LJ9>ʣg"R|/V]9dh1d&ooH^!jGE4odT2\30_~?6Rֿ痵(kb-2=ʤ˷[znYzPޏ]"fɪzpXJO{J\GhVm-_?֒WJ_k넏|SNUYDk?_atckIrv Ŀ]<./ucEوî{.u&R?{9םқyev-?6|5L`K_/7}MY{6=y'Չ::ݭ's;>ܺǶܭujs<^=i;yZw=Z/j֝w;ڻzw>3 afSbo/cwݿutt7+"9+C-aWZi`SY _4Jn0s4ђzَRi Vu4JԳ۹UGKzt%(V]1zs$fCMb(/s߶ZInz(p¬ѽ/Z׃ͭEz;F`w{ğ|@L׿~֐r~x^U<.+ ˟IelDy1םiap,20]y "]:tA/"^K s6Nz`8w{ztxz*]6o-oj=^[o&VEq@PFl!OOR/M_c`20hM%(/ j DaNt%gCWnЕ%mNW ]y!JUGk骣tzt%J%;#Ά:\BWm<]uHWJU\誣;?銑v~i6t8-骣\~ {zI:]=t0^^yZt@W~ǾzAkFt%U n.tђ:] JgBWodDW\誣e:] J^MҕfDW.hmp6tZ?hN:Jt ]Eb-sQTF^{Ze7@vQ |:Wz~v8oluF 6?3:#&:9tS&Y}(t:fZA8]rMTɵ)CUf^_^糜)o6yܴr%XJIohyC#-cNj@7kV f'\)Tz<\ڷ&mfuF;hwQN-|^;À޵qcٿ0L`f`dIK2FeZcYHv,=Zjlwɮ*U=ltVbWZ-WJ&}pe ./rꊹp1SuZǺWdd/W`b:O+yW_"\Y+̂bcWeC'v1iZWJ =\}6pG6=dYC{UigfWY ϔycyp\AW6=7B9vApEk.] 
\k:\+ +a Ny1pURX+YXzrJPHd8` bઘk/ZxXzұU1WB%V0B\ss+eL֔y@  3ḳOZN鼠ֶEm^m)lZ&&o{al"9A'/$cCܻ,=j`my\XG hѻI*`2/tŪm_?tob~.z2ߟ~7H_<}~0H_C|K%Wat\OW?8n]?0x ߎB?qF/Ov#~ HwS'@G=隟V'͓w[_z?~ٱ^L@-n/y֧Ve~燶28(<G qҼS/YC_.yv6<YR5cI2)G\R ɧsZer- q1~2gu]1ݜ.kۀeh0#'Wpkxk8PB]R^S L!oKn]M?oߏ֛56,ޒŵ܏癿?lE_X_ӍeɅû=qlj,(P|yISH쫾~\ "mm^N͘hRk˚IW2dV|,`B !̭#JPjͅG<&I\VdR: yZ LfS >9 (5@xת5W qR_;i73ꙓ醨ʝfF\і6g)ή7'b >e#}ljF'JQnim[=J화c൨b#_O qX5o?+H)iT#9"3tpEVc3'#{_94UO3\,RsRju*rZ Tpb"*-"#s U%2K4!&R!8rYJ29qw; a*j&-K {@}]2/IB-#; OKmLM|gQ`4LGAL*" F~QG*T.I'$_"(]Q L< :Y^DQrRKkILV3簣Ӗ-%#?#sڶOO6IٶI҃龷v&tN`h}|(v\4͘>rq@Meu<[vm)**`)0mbR84NČJ.QIch1m 9+Iڔ%)TmRvA$<`2HϹVU9J5,3BW ]O+ʳf|ݤ6aIjgfv8?!G)([5ڂ؈LhxVYȋ\xZ溳VfE=!) lJѨPB¡rSflوZF05Ԯ;Di{xh.v8V }6F i90'ޯ*3G)W3qF&<g?>Of@ُ!F0[ccnD$ 6#> Ms\_҅4GΤ9iK45y.7t,gȝY1I"PD)A&UK-zCЧ9xw.d xқQeY#Xj"nIu"Λwd/#,uW{K-^1vJmꇻe ɀNYf9kYͦ׳i\B+78t^y: .JnQ>7t.Qs -(+?.~LOݾIXlFd.C]~n)k"Jﰬr/<Η&̞̺Zۘy"wh)ti`iEkğC_aY3~7jd@*xĬX۷nU">.?v%k]I')`ɫӁAӇ>%W}YGV]O״٘\XL|5.n( tC/yT*`8f[!lrjޏj/|ŞGԼVr{[ݿ-3JpE*f)'،h0|eGRf̧${=\ZjGڞiM[lrkoqEMf2#l@,-i~ #ʆ+VQ UԽkgX|8Ż=XזƞhY`2VDBM"BR#kyCeXP"D%mT 6$Y&pR pFDn3Y3s⪙siLǿ=6bunz qɛ5k!A57-k3XmΥ$ ĹOdQaYgJ}4Y)=uӤ]I'0Ș;20Itm:1))O9pHɑ0 !p0EI)c s؉Wkx7+ 3pkN?^+t@퐦w2{vѮE+UgۡL75KP-v}&*Qvm[|O]7V$ ɉ$EĶq>(B(!8G>cz,Ti=w-jקve%1!KS$gD>ཌ&蚕2# _wfB,I E:I( }ɱT;M9GJv2w\, H?T;>D{&lʭU7q:#(;lpu QfJ,"G9cYVw>aytEX44"cSN`w2fL,)dmRȶv>T5sNur6\,]rT*bVh%*ϴK'y"wpREX*nE>k^>E͸pCe&mE6"%]Lں lDtIQ + I'?&:ș XMrѱ#4C;Dgx{9-c(6R7DVd(BP &J4DQACo[ ЛЛv #xnXV18<"Ƣe>-6 $YdIOduJqrjxdRg  L\џX{/jO5ah䐨"( 2YCIG m^Q ' sC:R,'=1 qdPBgvO2cN{ϭTσ{?F!&x3\‹23dւKf5ɪ~Yer:2rʢ^`"+EǤQdm"5&,Ujg_YQǿt+# ,E!0v.;ԔK)غ 3oL bPŻϥ/~Zh ɻBnH.+"s" F=ժFKݟ:y[g "Kk&+!Z ZCLBA*gfR0IB?R8fx{7dV>ѵɬGP 0ͤCr0aiȘQyL lꍈmwSlB j1U|Z%{z.'`묮U=臻"M[dOEiDI@Y&$&ϔlJJ6YKyK 1+.vql|EՋy~I)Ջ_쇻Lb@o3ؓDbF3ײO~YZx\[2$YV~*`$m欲6cūy6ˋ?.;_J % 6]L(]v3?{Ƒ搑]EyⲾ`_K)ao #QPl:(M{US]U/bYlbc:nh2e.+^֗o?EQwޛT%eohTZ9;x8iaGŋ?i?sk[25IX'lHޣ^79G!RJTm)>t/PX)XPqJB"gQ?&WLٚZiR|O/ٰDsi^;?v[Om Q㇍D|Jk}2)KrθfVD04U",Fhƕܪ)h!l1"uF TO&i b` ͪ۠7ew5~=wz~'Q/*݁w]~>SKǠ-b $AU,XPJ4K#xŹHqOV69O3O/Q4J+F4 T-8h@ FNACakB(GU^dAշGIq>xR>< 
G5~5KkgiŤ'iYٴ?xw5XI6Civ0?S?|ݳj+%I/K[ZK?בyf8a?cEޯG+J7Y?ϟ$)-EdtQRC pi+@btPZZDTͷAϳ2'/ L C|! 78=e788/}RpKCQ0Qm鼾pĈ.1C?ѫI?2@^=m4k%Z< LKa(+[Da[_Weֵ1Xu$( 8bF;'QQRi MĦo1W1使6SK"Z缯;zv$X_CIXG,^a@ #.6=6OZWM~9`yohX\{V5-ELupՑ&mY 4y pVrs}|U[Ճ2qa g-qxTA> +e*zKZf<ey:¶?J2/_p}rֿ-G)UUy}#?It9y3>M[,߶ {52jl'?FWOJP٪;W}.N柷0l؟NLS- 3Apxgj`50g0ո~ 6[LҼHzse*V𞰝{eRɼ}6ڛ.;:gV.!GϮϨC &OBTH<(@C39Rkij!%6jEfGM'aqu@vb'V f,.ΝR*$j$A[)::Qz}:/*AQgc#R]d`z,ƒU /It՗67hEr4,_ kzt|L$Q'"W9a-phXB#^Nj7Rqhؓ0q,)Hh(0ǤGmi&b{xkB|Ypڿ#f F Vmh< (N:7aW9R'N|)4ubXH%XcJ!H3tJHadԜE%N^3(# * 5iK2^(&wu8| ÀB9/Cá7_>4o(5&V@@hDXC κVx4y e(}"|6zE}ng ŮY9Ѿy$F /pK |?y'~X84\2$KA Z82A>MrnMzp>ӒJte=-?1:˱iD+(TrMBqI‚ʸ0{  Gqdg}i:;ɾi}>v6 y8 G ;m@<5hpG ƪy lDBD0,y BL8(Uȅ5Brd<3)ZkKEJt]3r8ǣ4\=Y<>bST-vѲ|$Rs+LL$WSx*2ZD^)3\ⵜ@Ӗe= q8+1'#ڔ2W(J$/O_jC}W;<3kLrtNIR惰 zC*,whQhP[SI)r1AY \@}$!z $ F銜?MϔU^я."N3%}ACfc47ob]o(EW'^Zv,$Cazb eBs, D9ϥaRk#e.4>5/Zq. Ihภ$e3rt݄ GI9IIIK XKRDtVB$R0JeJhJ84wwl+F!n8H|$Vj<5zJutsIxaF'`1$ցSʝbrq2vH2ƀྒ&H"|@1#H2&DI4ϑ<څMpa¾tta? #k.9|;@:QݼBn̾oZP\oJvp_?_e} m6Ֆb1?.Ⱈ}-}@b(ֿ\iAugzv80lIi~mƿ :Nl ZAsFvf{ُ5THzhil-但pRR)IN[Ut 6ZbjOQ8VlO/weIxDt,;AD&91aN -W #'&˰Fnϙ71HkE੉u`a  $#1(:~9[$!U gW nZzݰ!}!]oa01Wp.v_:*9G«-#V<]PJ"+s VDs:}tp@Q&1I"33)?P+u! DE\l6YMr ڄ4Vgc. QbBs?@-EI"3y A;gyNMr-\_7itOAڥ#"lK=bM5Hͼ;&E&(BQ/}RW|%!fo5yure}W|o%-d'XO OqOŷ&B3@/Oaz?*.0|'5\:ApJBuP{?8gEv5ը #|&9KP@vF 儝\sbrb'xO&ST'q弭4h&Y\<8j=!H@9=A3j3KX>y:7pɧT-Ǒ 1WC:}?WR"c{޼[zJJ9͉. Ӡ$YLmp#,aTX=EϺb~*0I jCJ\4_ka+Αb8c6^ -w!m#>6 =YٿOoZ2r$Fm+PPKNd/~ld$,}}z6(YmZj7o_nvj-*uga\:z͋&|՛_~7_} 7{WG"[MhiYҨvo0KVJ%JJ CY̸`?X'pƥ #! Kn{iy٦ץzK[m.{Z5B|G=كrirG_`,RY}wu|Qy6 zd0nTqi\]}FWDMo^I?n+c$S (8?iنbjdx"e2#5c6T0JoS =A^wqI0b\2r0y"C'֨IeOgg'dߵI3tquA]E;] :;n[5r5tvɥ uxANύ\_JF-wxnTF=t[?ŲW~ѻŅ,kTX!Ɂğ8Gg&^cqfgXh U [*ZCEeLL\KJ a"ﵠS`Z׊j}C QTxԮUtX9ȗ >([{F; m[_ܧ.~9\ ܣ@on |̈y1^"n/S^"r.N7*CW#ZPFS/F]侘m৮8d~= |rj:mƠņI`!l)Gߴ2j+'^~ftyO:4j[- n7r7߬e<1_O? 
I\y_JN(ols®[5ˑ?$ X.ZYb.MJtAgf/^;o}G7I9,o.߽Nxh$L*`WyJcOR *bwףafe@E!jLٯ{CpXW(?#ƛ"xψ=\=<~ΡA=o~"2]5ҡq>ȽC)% CN gz-qj'i;qvNٔKkxM+Gg$r*sFr)Ī1 B9qKZhJyC;v[{Ko|Xg1l`5la!#L巚a;v#8#AjZ^`]rxcHp/d"rZ`6hC gU#e3ʭ܈ЈDF3]TO(jצdP35rM&,"J*PW00'fAFfb3sMUD1Tm+Ts6E}wv eO*{J9g8u/hxCDl WUTTsBʮVQ—bɘ$I~/%* -^$®ߝn'Sta"$XgLeG9v#nAw/n(~ɧ4';62GGOyilհ3^f$*dT)r*+ZA'-:C:: 5nME A /#a.uj)uևȹ[ D8=hcD4UF5m J`N *@1B_K#W l lȐ (ȥ/mv ؘXzvYɔ {:XRj##4VY#v#n=˫΢ÅpM)ٍKvՋ^Գ^Q`oR"DS7R0yg}uJvI0*$R(֑va֋qTM> L}ŧmV' >6fwLMYQ [$ُϔhZ#g?R~=樼N^4z:di Em79+Msܩ#9tM&qGZ9Rq"J qNVPI DkH(YSԙJ;Őb!BmɗPP"zSiBlbe9^[z®v}K[4ӿ)R{?.]Ɇp>:eUEɁVf,Ӳ2Pͭ<}χ8};M?mZ HxP*l&l=UOWeϖ}E6T%-Ofř@kYT֖KCWDn$mq,;NNR5n"nGUjCl A￝=,޼<Σ^9dpx7?qA9s3WxX9[ܖꞷҼnw,4tò)4#v_Wn;e9O$cR,cqVr{hK %S5%PDkѺ5۾8&[ e *p3qB`|m вFR(L9׍-KOs*Y_ jV'?rzPƠ~4V~$.Uՙ &)6 -RNr87oIwUuj}!U&b4Qq"HxH!kL qSP9w uxh^+&|JkIwԣ_@ӣ_UEWL/m 2YFn3aW0o:x:_w`X}p,_khMdv)!SCnb}LPܾ6F lX 'Lr(,XOWzIw#gKyM'䔪;¼l~>q\iY:&ذa]. |}tINQdوt`] * .hn͔TaPRձ oЁĐ2*> FTJ3U SպDHŕ@b|nW@7<,>YX$,Z1]s ZXm }Rm4߼TJCYXCX7ڈ VZh&+1QqJ[1g)WcU)r2գXXʄ 9`adT A 0擜U&dB9չIZ5I~Nݑ<fCt3s`sӿy&j:ޘT=bl8^\\9vz3 9fZB^U(9ZHBX&:c%ɪȴOV}}ۥC\',X11:d$AJ^V٢O)!}9Br <Mhݙ\?jF^a((^UBt) ыi7v /AxRy):Y9͒MrWHOg9ۯ҆_Y'_CHUDL26C_*E[CkVvjX6DŽY#ޠL rQY_AaEgr֍-l^znD)X\BV1icȑCT!RUuKNdB(|"k E~ng zǙR,1kbKP1|ZSk&%"B0K *gSb=ߦEM7>4AA+GB!# U `UJűlfu㉩mf=L+kbú FʋAthw I9oĜ%PNNGEGhWSUО,KglR(qȎҡ;xY]kn=K:\{HNS.}(#PV s3%G :g?vH@։ 4j[%;w+J;Rlv~\\H÷{?~VI{ȧoZ}|v6=?5WiS{^3#Î"`|&F'`A3E|W=×dDSCADi]_UWWB|3 6yr DB~UB:e.:<:K)zל &/rORmO: xP/*p@wiQB#s"RB:oey%IHM _`HQ~q2en8OZ˙VsBU8k6c%%=IO/o'_52ar4~F^<1-eћ8ߎ~廯zVZa%$"TZeqKڵ' ;XzDZۇQt6Ϥ"4L.*_j@6pfP -]d?ߣixUAﯓ*A'L ]|!>9iᄝ"җA:(g4Ȝ%cZ:ev f}}^u9yc)\\"x>fĝ4[g a) e`KCHQ1Lz+cK̲6w~#)(* 2 '[tHvgx.ZxdFFTC)[tMA{1yotR89N<|E*ӎE1.vn*tCΕuWYQiU}C^74m,^ {]=j7ELy?ZoieCsh['ś2{jΗ5̏ڪ$א)(1.t2VAU7UlT5Fvk)r%iL?en*Wb\7K/UfI7?o˶ٰ?왦[f|$̕u;Hn=t.sV۬ yF߅~mw<ʰL؎y5R/yՖU|izg"ڰ5t]m)_P2v3LxPJ"g$-&Өa40FҬbڞgOKY-簣h{8rVxe84`w)pR9Pc g\ʄN" @UD)袭$!ġ67O|zš.["7J6̥ݪ_u hupbo5~g:'ab6b7y,uGj4/Iޣ\s8?}Aq[p1ypy*XVRߥ -[NT MBp9B ` {m):Ei`3+7n> O~\vpףwiޟr MS_,J߼f}7'OʰRq m9.:E 
R<$`I%CTi"M.)W&7K'|nE-5,8 ԈUŤq= _Jy6Mh4VF!R ֘Rx-R5h6h4DI3kM{3,_(aq ОGpH[`t.:ks~ApH тf (br2_Q:Dfq-Fe!醜n#p-d@jwLó[" -nEj 5¡:0j;tHH %ѵI9|?;Fr*(Z^:l(a C:p PԢQd)Wޛi&ùG c D9Q!d訄.q:9$M^21o+;GoB3M44׭y8g/Zܭ\C״ oI[k߁s#2x>Oމ?t  !LRt:fv=D\Ǣ}O'Fgw7 #қh9JsIUI(n" BXPWu;pG^Niy^7u[Md1xgNN62O *0g*r?!&: !_k7ɥ*3CJ6raM魱\.L|+n'm'l9x# h4MM,ߟ j)*,I4uYވU+(" .Ŋ5(8OMRs|bl>".b X 6@u dârgF9WcI${bF^s\ R@8I'!pw0 q+P|ּE&́:s#J"P`L O&S&W)S̡vTv;q"{l;B߯<ߖHY`m\pFx]r]dJ!ئsϥ@N;򹶐%O{T 1SMٟ{zZ87]3ݜܟc\h6Ֆb1eqXL\FPX=*A ~42|;}6δu>6`r`\_kf=ow\BXh]AxQ 95֑V%PО'q57ޖr^C JZd`R$K* :-ړ '~eۓsN!>JT `Frsrbrh z=|7G*=xj"D2XpAX!(+H [#g <\>d>d37zڢ׷a|Vp[ҹvwܝpw7>^娒ɯ{|ϖ=8W*̞OZI(J΋5 t^̛Q&1I"33)?P+u! DE\46YMPUgf;Ki9<˝!z n:f5RQ}F=Z0ӊG$WVM.H\MG m=T2 P2:yE-2BT\f(EZp;{>kGNDq6䱙uOdµΣIq+b`LQb8k.TF[ 㤱:s')%[Ϗ[5Rn;P!.8Cs$'JK"3LwHHEGi7}G DbH>"¤A(O*9=-rx>dӌmZ\Z+ΒOSBcxq)bhs3vp&bnɥ+#vA-ܬ?qRԧq.G}3n]TA0J('!#; 7:LQ^wOjG'$: (''Fm&q Kr[ߟÑL>d9ߏl8B*xU\JT%i{k=y %r0#U)lrQ]SGL>} F1Ok}.ٴV&7/fcbPR$A[]3ٴrfxo98|.ijI7$.apY_,oiC*#G1j1ГUΡ*#dSM} j&ي<|}z6(Yl#Zj3_?΍P5K]YX.|P~W_o^ϯN_|2s_ywL.CM$7 ?ݏs>jۗ?l4N6kilo47bURf}پ_nq~R00|՟Y|в5볟@pO.+H6wmmz 6#8 vN/6ڌ)e'zxՅEġ42$Yr꺜M"^ޢTiYEx.GV%Wܽ)'P+"ְﴏHqXn#1킱dhJW]BjL` 1"lv6D^Us_mt=A\n'H6>A*C2A֚4 2ddPݙN#g:ty]P?Cc{⊟e`K{oċOA|r޴6';##y1y"UXՅ&lv }SVJU]^Mt;h,J8dB $.y]7t{]޽{; ^vX?{BvjGJGBxW]?oMC%ӫR ʁb+h;HL˓t Vh2' ,9[($2|)cyNe$RBt)Rt&uVT$V Yxo[mMl]o$+t,\qU=uG1vwMdwivWG1w1Q1%j`!T $r9R5P+Y۫*tU_b@4rT?%lK{Wt+W0ɒxEy<ʂW`!~ڧrfqO7ا(;߲k*ݓmn 9) W+Lac&-:#묐̾XgZ m Vu >+"KqUU/P vqU\jщ# +"<\iJ\rao|߅ZzqUĮcKW)n*7⪐k很Bcn/JL'^Bf٧~D#U!ԾBi}?B%vG\ᚏg Fcq r91zF\=J% ;qךbp^W^u:47r-bMAjQD.\8=2s9?P:KfBJX,DDԕ+ƒ³hEm#yHmϵdŶ嘍i:EqӛY>%֧pzA^1}9L0үh"j1Y gz`22rcY@0ܧ Ii>+ܴ'_|&wUg}tY8^oݾ=e"0Y8i=]"]m*bRNKS}mN{tЙ~ mlI314C Ma([l|C mF֋>onP4:gQj*4MQC'n?]ݥmusORV9eY.n޶f[*:ͻMrgKn,prhG+^o E/E-JVU[Br]☰kBXCMQ5-JQ5]۳f %ke!*rl=*'\RXI=:Cn4k6)Bq Ai+!rcJ234h(xYs4|_dӺoϧ̹p91[D/pUHAȈJȤpB̌03;k]I"E>Yp-hk^vgPW客((pR(e|5tr|kvG[7S[θyuՍV7{uŎ {VxYvHk7Hs)ܣB=\)ľD]jA=Pimu.9Tu8E?=z\.WN^6ʗ2ZBFE4FߣFHlOTF.S M-ȝ#DM%Z0\;ӝt7w}S^w >F\V:%z< k\*3pYK85]S˭wEjӻʹwkuL{]&mN(VqACD^$3"(NBG)r QcV!$)zD0NZāv 
KdDrQBF1RIhBؚA@XI`DPg mRarIL  "2Z4*9:Ś/{ue}uBo4|[5Y=G:[5X9f0˛-1mLZ> U,Z|8:[bYII[GQg\5ꪹ"GIrTfߟ_ǩw{ElC;M'i+niO:Y8 Gڎ#qtߔo__xS.?ON4h.S('ƫH@5 ݎ7u-ƛ -Zgh䬧 Ƹ59`b9s[3{?0~OUOF%ڡ>- Iq"g!j5U_J%b-Bx_ s9* viQeng6F8s*[.)@ ߐs<{q=<|*zՀjM@U^0 :1^^ oh2 F7LQ&3?TPjFcT)#U,ʇri&NY9L >l5Um@v{$ -l0:(YPg# $ҩ[Re'^FP%R\ѠT@-LF B:6 d#g B58k"}_2ԎH\ø,(uUYA!6sha% "͑IR:ڦ-c@םC`.Jt217*KbT&ey,-9N b$)48A!{¡%:\5+eZ$tp͕8KӮT{r[m&9$IwX>A6( u' XH<ːav%:,hI)'1bƢCiyl6Z 3Hc|Tm6_7DX3Y(E銕HajzPb_ʷA=G0kETusKhor$LjS$8U۳1㜌uivq_ vHV;uTƮOeσ8^7&ݪsHM'D `[&=]^'oΛPн 3ّA&(ӐظnɌCz 3,$IP;kYg t 2|4@CLk>Kgܱx hlR&=@)].~SS,lߗi\c ?PGz1֞SFd49Ni6 Yvvjx|bJ\|?1_"ݻf`8`wW>wwVݤsԜd[J&^z;uEs8\2\q#г"Pi_WdS߻NCWMmrϧk.Eņ]pjy޸C j}wDd|X!WߏY~NB~}j}E= Ӌ;%yjNidj5ɦiR^w"vD&! n]K1:)o]c4:(Fh AʽQ$0W0KSr){hm_}5v>E}*s$t-ίσ-x=4[|FpSJmN}N'|`ü8ӭEb.lNK: 1jm#"sO~zŁ%n$d (PZRTP!E)%2,\nJ YSih !:1 @RspE%Ac0İ8;^>-Ms`c{"L'٠da2~خ$L)[ 5b0ؘa$ Y+k虼( n#?Z',x{*zBL IZTMC#u4 %49ƚL,"e0bHT{~]Ī b'xۂ"~-h4 v[% FF-C -`j=+d@P:;ٖ9&dJɧTNgt!;GJi*-.Ƹ(\pZ- vlYH "efNBh%Պ x/xlu 2=@y@`'tȝRjk=:dXQ; яɤ~):m(D+u[AIF2l CW~9vM-ei易a͕68 VgԺ@.btt+F!srC㗇9ʣ^`H9{ZFM.AA*iG$}`adeCܑr՛Sz\L.ġK*Y}7O{RVX[<ώlt1EβZ@UZ9GPiA5N׃snN;aԲ+r`99G+rIDZIf2g z3ZMKF:MGNV5X0˕0?Jأ={D)ek,W&D'o\)2EOIk1k)z!c5 [)ٴԨo0<:nȭ;nݏosyYo;Cnqn4|x{竜;|a>iy.Ƽ˚)z:uˤ Ҩojy4_75{rj$;p="j߇viJmXBg|T3)Θ'Fvht"y>!9-հj[N:e[-fȀ XA4%Sx*6*UW"kX$$‚8PRe ![kmvHL.cz{mŵȃ~{[sY=10g:Wv5,V8GbGNqC9jw6؀Q1_;/[JBER(Q0y G9z\rɺqDodLeҋP|Z5ʗV#hrGEdT!3(he *,IH(' :]3q#y=HlÆ[ؑmΦ=m8On9B<\)xR  $#Pyi!bXL/Fxak/DyW84NÆ֗lو+,s Go+Ybz6b\_Z9 +=CNZ5aq{nXwL~/)&yBHhXkr44e=D)T25Mi-R$}6 <И?6gtc'׏#y$[l%}y~ ]Nӓw6C> _y0(-i?m:$d4̪m)GF RP"i@O$J$OBR&YT%ֹ*]Y[eS-{OV↮)D|R B7gWc`gˉEsEC'iE'a~Hw:k Ʉ[`ק4f̧ jcO̔Z_l6eD7]˷2Q ~ͽQz51Ǜm,vA:ksHXyS<n"{!KE1E^ѦBA{T!:RH2bGJ,f@{AWxSHfYJ$˅$i+r҂BDLB4hLc\G#=[0Ov8a:{[;s% W_0A'Ox<"(}9LD%9!<ˤĩ6"Y=`Bl-Qy3ehOt:dRM h:)LLdCOO0YW 鬬PU5T%ͩ/)pў<),ďI?.~8Ҹn fUik[d2k(hJgiu!+cT2_W HX~\ -[e>КeޕEB¥5oh_lUʖN+OϦ"PהqKS7B5 Bb}mMWeJ!+MSgt Am( GV%n2ëMj6(~HFX1^0VKu46,a+==0'7@u<-|Ap")\GH}T4/!w^ukD:TIQQdō\G֕/w0#+:RLX?4za& vOd|dox1cBXIzӏSX[n<2K5oP`F ͔, 6?sC)eݽDdziݖlL{ⵍfڇ 
4`@)fO2Nc5ʹRHax;:ܻs&>*E1*:f0=0^1c'BXhgT^$ſ5κĽU I ;Ukᅭ|wXʯmܩRʾߍݿkCVR& 1N'C"X,BÍQmQ'(߆$.g.A߈i0Ըs& aNH)!Lu_f-kxc$aopQS?TF1 WwW0Uy˻3)Ɣ=7 :>? A>",ޕ -ү3y4R]eWjb.nДXWW,o 0-'~ѰH J;<8䖨kS16̀daŘ.be*9ĥН~f1!|&~YTL{66o$O /t >2 ~ɆPۏx)f{>| Bι`1X>="R̞ STlג)E 0[)b.c&nTLĥdZZA80LsUpI4W_҆sbX1W].\svo\))ed_v&bTJ\λzJSA*1.8?1W[r{Wλzh FT.P\ưD'޻ڍ]IՎ*|sZwEPssUۙ^&wSQUff4Nb  RiyyBἩ+c7/o5 w2.{\rcY_¯Y_e u<܁WO ^-[Uik'2#3?{on 3Y6Oɿ{I8@_#u2M׫}tOO qiKV*Y.7\YdMb+eЂ\rPBV&lARL25eޫ\]U3Q'l͘<*SFa ilY-G'AtۊVWTOS 5(zd6%/v)Yv͉B {kzQi|x d;C`ヲ,2gHAÃȭN!wqPI^do18;֢S~'' 0|%S0g8h0qIaAG& 3fH~UNN8y'#~$ܛ[}yɾ -e;}yOVGAcE4J3 jU^(!`bP kU@Qb-uZ7 ٥h18>ge}x6w on"+1AEͩjN$Q(GNPq&`XcxɽLO,fyRd ǁiqjzV8ajtLz$06݌'a70SzPX=՛s|U 'dZ䔌ǿ5e'ݛ{Fe?AWXh]hB]Fԧ`^~=C4i4w +CCPE= qae d 7&eWCo,t 4 dXO D7hmGOك*'nxh|Ms@u<-|A3R:Bz~J2D4"RiE*((=Rܨun _K#+:RL.[̨ Kykdž 7ѶH\;sBx>z{Ao2>2O71xGf ;s7pT2^fp`={ҩl,)J U7%77}oUwmbpoŴK1QD y@0 촻Ų'&vmݵYc(+@n6/ |ZUxjnDh:vF6x%pq>s :qjǩ+%umr!F0+i[yE  ilt[JrYP,WH'((,XtLׇ4t>o7$x%ȧ&ClO%Vq$\@KeV8e*"SH)$(0=jU΋˽|̬{_F^r]RlvurvuS6ކ?LkϾ܇A6ҒV|,[6 wW/`ePx׆bzr~[>ѝ;t3Zpymza]:z}v։<-jBs Dyz:ߖYHfsG#69/;fmR}>Ǭ{&Y1 #&8=#BϨaX^@ATnH+,쎎;:YnuvضVLEN&bE ؂{'D)A`EKG 4*8Ԁ /۫;:R &|vE$s2JwZɴp0AP4@ia_U˓Q ]mo9+}Y,m`qev&0b1FL_%%dŦ 6Y]U|j0R<uM#PVJoZ*X I%%:4^GF 1 `ΤSL ].QJ#AQ9֐skFq xOF,Sj6CQ4XlQf eL;255S95k4?t6u"RЇ|?ËZnM%],o'l&xQwi^L=)?4Yq>#L5 AH39IY^e?7Ud1eQs>˺@&dBjN[bZj<%٧;n}l0)kUdD!(E&PpuJ({q'oS<ӆGUh~ֳKm3I;NO;iz xqxv|rxN<)7yi@`WLҿyx]¨ܛɵkZ=Y^Skcyuw^]f_ P aoyM߾[[.dt~'_]Blm 3oFmF+u6|'+.٧͍^^9=8zΪY7mmdoHذequ%MeRơ<*9_X2e`'XWˣ{}u]^'XZ|}<>,z}J}>JalpہF6ΖrPYu`u!ԹTq)?< V3 V3lk5Yk0]AH’Z5kՖT4ct;V o:R g[LH|b`:| LqOΒuHH!!D:,\ڤ1 !)|A{!MHY>gG.mci&ȳ1LcYYȒ0&E(RJ{T4gh,a,LTLO3}6ahԼń"QGj+]@W=\ Iy$ Թи ]Vӳk C L] #xee6@F7"O^ETҊGCM!J2&z(6BiG]V4DZO֤Pxuex<٬@i.թƺ1$Fk x"(sRd %ث8cC&/B[k2 A4KElxQ?1Yxy(S4>X'%16rYnǨn1|vMTlAy90IW4Z[VIi5S_hl|b=6ZSBשd F侗>fɉ{" G{6!<.XOCfrG됢u<몮C/OWݝt~@|JןҭOyt=(䑊M|`mh>W^ΛPLm BAv$t"tɤe{>F4R=n䭊tY^SKd A^ج3@ t 2|0@C +/%|Kުqπ9.7kҵ߹NL9~[}?÷Yw׎y}ۮ5PqԃAMYնeg|_贃ܼk30Ny;"uN,auuW\Ti2CF#b kb ߼Qub9K@C>0i~y?BC)߭»%k-|pƸ%wSҝ%5Má<a99>?]̫7:OwٽWj圍i)URSBCA d#y~;ʴ0 
XA,lT4(4XE+&cdT &Dr,Y:HdMQ^x( )cL)H+R,cI1Ld10l&Ξ\;]8;mԿݩyBH֪KF,؄A^2ȱ`0;ؒI(_A[3bw6IP*gTo*qJ$Hl$Ni`*j3q(x{ڣxt{F`4ӓ;`ՁՁ)O?BkDY'z"ԭE|pS {g_,7}`7*!nS>֜C> 3D#C&@V6HLJZ S5ɶ-]ɹ$~}.aOr IHٗQ( [{faff6.ּkn{ 굝Wde`%͖o.f=+I3dK]  #6j1P&Yk3xQQdt[6Ru 5؋Y ҪMmZHW]N"ȩtۈVbCʹ^[k[CY;xTkF6X=xB JvAZRMa1E02Q4jrk2@G}S.& 䠚\l8>ڹh>GlwmZ6z#MV#ޖbbN*$ *mL'u12*} z KV:MT1N/ݿa+q`}Ge6!:i]jGK#(aUA!c#A?-ۀѶLP`f$V6Y7ŇCʹc(P.:ma<~܋f܊m2q"j1!6oP(V8ꤷKbd2eir*s/`(K2nj-W  6u%A^\09ji< SF79~}"A [eotJtUϏHF-KrOꪷ3/5~3y}ރdʋN+Wn)n +O,TzX/ӘIUKL<t1ikf9X_wx᚜&?'0I%oKtO.[: ֩\؇A͚W+|ÞC7 >Pmk:wl:S,EkD}_,ߴBնnĘnP.[jK->̻+a@*8`dèuʝ]lQ-Uo/l[WE/V$Ѿ>mѭ*yЗ.kϡEBm` XÚCQ{6ԈmvC>gwwϷseЋsny7̎dvb˓oz13wbc<׼Uu<׃2jǟWzM܅jn.stt7tIg+p}f̴=V9IGfBns LgiO>Ta1Psm_!.nXcd?ŋb2kg=!)g|ñkש>Ӻ~l_ nņL*@&:u(gI%9MK R${Gy|Ӟ: 1Tc/C“9NjM)Hl4]TW^,%[:P. #Ө6-}*56]\Фl*X5z]/fcl"y`orCuy gs s;z&W?AfsrxcP_:O~us*Ϸny-2U~,6m[0b~Uc F4|êӍk'r"!rieБ&*$]j}79wmu~w'l/3s؉)P;=0vn`$AQmm` Üe5ej;P[*m#z7ua? ,W;]uC< œ}*[?~]_;f) 2So?DnWM(򗭍g&0E9Y|W׍0[\^}{? *}\JeD NHږof4Mj!Qժͯk^/n~>«⇝q/tjO?|_҉ʏ.L-KhN'2Z8IeD>;;3 /rFa_ZK *?6qm@lf,Rk]XڒSb.̸:B\yiU< 6\\ͪxVq$dqru+:t+kl/b/m  r+Ȧ'?v5J~p\jZ'ap5NX2(GJθzjKv|fpnpj:quRN؝Iw+D/ZbJ;|Wڑ XսվvD v⊄PFL!>TSR'^Y 2RQnzIC=trvui%KDWmDW7墝ei'/PmE SZ6UՒ( -*Et^R,8P71JIVrj%ykLG`']7b+Vkq*qu^~y+р3\AmT1y#&{`π3˵$M>d~1 _ɞX rPժ0u\J3]#BTGbw++V'UθpD-Q>&uzqj8S{rNqԦ <97Fd˵\ATvbD3W*x]G`D7bՒ:X3W$>Dirk[(y0'D`\=,^0j`JLӟ䄟KJvŧx˨h9K=Ѡu+ ̏MؤQI;ی9FsG#(#P,R7c߬POŌSԜabw+Vkf~cPvII6t+6T5\ *WG+/$Xwt%W XRSstu ;: 6nfY+VP_#^%sq5N.<T~oDvFuDiBvCi+U/f&pJʣ4Ygďҏ|eU.(JI=xk;,U?n|-5r꾖UZ=#V}= @ C'5NStfʑ2Jt+lVrY|j?*2d;e\ HQBOzZUav7zO|Q n,Z0lG糌-^Mvr-Z7:V空>vJ̷f J׋sAjE[W6pëNW 6y} 5fkPku[%Xs(?^] ]%^}OuwZ_O[?7]z'>Ym~6n{j?lXf[juo~}Ox7vPqf9: @^Ẇ 2i8Ӭ_|B|?/S>ИBЈ{(.#/=m^( M]eZl~_o~jTQke|z]E%iSTeVfkQ,&|;ݜ__o7># 0Hu亖- _gb.gr/U-,D͖@շ䠛H!R(&i$ X\O/b"%$'Q2fE, Iq1rѪi\X:QlO)N^.ŧ+i!5ms)j!\ B;ѱ&D.ICd9*D P>պZcPO^Z ZIAUP\[A`uIAZC4֐۷wl5crIJ[׌R8* HTGTI B@ii{,b :3)e%1f3ԍNYV":)C)Fa]q{,Z˜)_9Z,A됔2Pv lK&aM:JT @L)W44zѱ]ˡ e& 466 .XX!@!Z "6^\ocYxc*lC'P2hOBU_x\& WeDM*1ڜ1$GYڢEkAޙgHx9HP1Dk09>ǯ#>uD,ڀ(> 2 Vhr/ 2%IkSm'M!(A${Y̨CR [AioeM1j3,HX 
Λ獰|)+Z&V[ewCy38~ߡ[udΆPʊю}CQڙGK96Lj 9:3g{Ρ#ʷdKkN!i-/Дd:ߏt z,%FLvٯ3[?wafKbfc <1=Os&Lf\)w,sNL+6n̄J'ӽN>*Iƭ.vgEGYsNEJz?ﲐJ9_ߎ.S +oycnBc,f z| W$sIW "d[8)mA0]T}%YHO^~(kknH _\ۛJUvI%)X\ӤBRr|OHJ  \."|_OOLOPCerǶ3rk8C16]?/=j8M+/c, Wzw;oo9~ȭ `w//d\rXTrPuo~A*r]bAۺm,(m=h+pԫ A e;20-z06L낌B0 6LqQg]"&?t{F js&+' qLv@>(!L>T 3.q,%S<0?[`8FHJE,r`XS$[] @U֊7_/׹h=_R蠋a34U:h#-~bhI0阸L4T np~z4Wf\'6=h;LjQ"i{ۇn佺' (M0^0gw.o*UoW[5-jUAׯ"|bP{|=Ofge2.)-}~l>]fG%}$Ьe6I0_|p t?k5oCH%C|`&S$MnfP2H O<$[ڑ8XQJڐM%^tB&& %j,,:4 /v>ξ8h>WY|%OPzF2y6L ohO -$Rga qcoizΞDF8Q"m{ƌ!I=u;v UkM;d)VJ(nDJ*ww"Df bxOl7 Uo!4)cǴ%V P70jR#ha$50&-5>F|=EZ<ϷG Ay+=p ah̆Z(~;q.e "(;u6%6H,lпp{!}~}hb2NïQbKO"}3^yͳn.γx@T`*0)%"e*`>:և7v7kl+ijU 0F?ln79TnIb5Fo=Ü5]WuDY{~vr35cz_/rb$V {;[ѫ!.AQ{*j1h4fؽv3DhQR Ye4KTS-S K 4#qxjepIcxe-[tƢϥ]M՜bio[U2_=tLQ'o#Xᩂ Q)DBTZ $"A3dXEPVDP^jI"H5q8I#8ҝ0Xgq*/,Q ȄXtgv D "5eFIL`T Az6V^BAa P" UD(g55=A4WR$D*A=*11Mc -ܰ^v ˵Kqcpз\_=2wE,_yMy/%֊RMxxyy~\'F1Ws?iɝt%Qѥ?zu|\.?Sg?H(o>RØfMs` wǽ-N_ b4[t End*Eܥ;I.5F }[ eeL %KD-O ~#L}0P,8}ҧ'5B!kȞ# 0<=p5!*˞cH c&NɞzĶL?^d\&sp)J{0tl%F9.{䕀BǴDQI ZG`cp^"' OO'`t zan}Fn^quzhy:QBq/A)fq ~7\ҟwWU#0?q a&!#FDcĺ>9OB5Y&H*pѤ%R L1YlItc<\[qA$N݅ |dq0TԝfIH" `g) ?nǩ1$Y yOԔ8% 6(;u+,`Y"pD(%V:*ygVd jC$6L-eQ Dح``CjQ?h2Zr)%cRAS 1O '+,cjwlu$mV}*4)s Ì6R!Ra-b7"GF2FqgnLgٗWJJ (ddP9X '%ɴr*jaJ%)n isJ%/e KA4N2AwԦ& PWWS luE%bawtkx^n'}X%݇S{+z ]xtVv8]!¼۞(*wވw_ɈB !Hg? APe `Ն h m*Up;[_amE4$Bӡe0^ H́ \x'~D3۸O;~ޖ~ sf~MgAg-~>25XSp*F-85V $& RTskXH.)iddy^4cz@a%TIJ4k8\?$Ws.Ϗ~a~ߪ4U(0qn%2`!c`R" zrFT)xcBEWxAuMLT c; o<]oϳG;_]GGv:7|=Z|;=zl>׋\& VgޫyqBqm 7effպra*կvXId=mykbg%+dyq^H 4MgR5w*TB#\L$e,D,031\XĤF@h\1o5tս $JRzh*i:vPBI萒X/V Jrz/Ji>#s F>19Ő]0LwRUBUQ1:Cj̓"a"5]~!,.v|g&pbEedji᭖T'ji᭖ ݟZ⅓竖87)+O1B(i.ZݭݘQu@0EQغ3 X Bir"w͊YR.Z8HR®P}Z~~:vly4'Q:Gʹӱ{gr\xUƣnRlFj5"w!!!(3 `;a4S|[ ᝲ{ Daex5B5: f2i 7!,󻩛^*1||tְ:.`r*Dma{g cPovj[lgy7XWݦhx8/*vl~'w(Pŗ~2TC}i]asx4m^fljfTW{Dڧ:Lnh m22[n:^6ZSvZPv dm 7ȨggC<+MŝFeNfV5y#I1J)w[efpv$ԓ{=dT 9 US( so&Ԏ̪^CLxHs5s"w]16ee#^tX^LMy^kp )>_\'f}j3Ecł{ . 
nߎ"v>sZS =uronպ]!MBl-en֙ۮ)Ӳcu;?v qut9l{L’W[g L >nYN +lWd)HJa_놸;"oT~^ NZn㪯=nMxHOVw1=y1G'ĦߝCgnb#B=U8ەEϯ]MOgӑ B8$WOXհ `ۃF 1OA!#ZBհ`` 9p/ IEW[AC%"v׃psՁ T!>DP(pZ̈́A h'5CP(?%Q :!.XhMkm`T ~;l%owcI3ѹY.5,8M>,o!:PiQ4aLz5`ξzre*T0'T(0 XHcd"+b >MXiRgb6H3Yu:]O#'89%W#Β[? ;UO~~|G{8g$qt k%b/%i*eYf+Y2r,cP=Α6O]>NJ<%+Xj8Uۛ,a1S{y,iӥDVΐ t$O ϩ>2v%FݶJT T0%<>,#]a_NNO`kU^|E5ܾ˭_m{=vW\1|>Bl۹~v?B)3wemI 2i|H=r0$1 ooV$@AR4+xB{g Ւyb"X*F0&As .K2[YƎ'e~uof=gݓv|/ ׳t6(DR+B"T8mRP-zR= Tb]2eXSm>]p"ABǟ#Pل䩍F|irm/+zZ JZ4uK.SkH"pK+}cPd(ͱڏS;We`:PC]@.GϹsoXF"Mɐb#,4⺐j-z挣 _bQhtz?FN i"L>)Ťґ`֤dh`RSbl! BxBz/d!E$s"pjqhbP9&s!%$d@PQZ %ZXo# J VN_ne2ug^py3>"7R'g0ҀVB+?iU&4/p,f#1'}( h C0NED6XOŨ2Z4)Xen'9YAXAh6Π1r%Zh|W[A p5Z哻Z)$#Un(ͩӨqy< GT`Y՝ޠϖ*R4'YRHuI#_//N5lJr=Dž[\ y{1OR79S%i s۷OB ;?NY--5?g/WN'Ӆ Y7IWLok u?3Zl83k-x$;gVԠ{hv;_niK4 G2%]梯jRå#>)iHR=k(S0?$( :0JBoDp !7:i4K1 QTR%;׮<0$O'~.15L~_c m0/ne ~OUN2CMJe#ZRL:X`;ܯR1zT_u߽_Ez MWXt=bGe7>^_,Ә1Z+g XR7,&gu:CX/_9oO;?1BC}+eƱ3Mofӳ4/NoNONP{S>v0z=ż:SYF? JDMrcP)((/gM@XKIV۷沷69<.0!k HHe!慃( "Eht䮹;/1NPr5+05ya-iXj2M99)RU#󴻃v׶XbuXJSk9iW(0JNܺ !yfd! =#5[&KRׯ*^[ #7Qp H &ͳ48 ǍW0a8ːJKy;=jJtrWa7ks«|Rnf97hG{W6wxtI].6{ 1Yԁ"->]?]&d+~̳rv<7/EOW&b@U,tZiZ җ^ҟLjkW]]ocUA-^oy YoxϖZ= cW[rAo_p.y*D4ϝ8Z;-<)>vy&҇{ޡ xcrP)m/~_wHU^EG^R@rTuV$C*J)9הWeϫ_Ʊe iv4bn%HLG/( D.iyш5O45GW{ Ruw .ay[^}h5R %LGMՑG:HT.{Y^$#Jo a1V,h4ӄ? )@L`+*M:Z|lK; :+¶\ą1j"zX 5q Wvz4)bx N l`BsNqx5ɎW lPODi5YCLሧA׮8btv<3nL.I耨x ?2XϟQ@[OTBIŪ _GN_<Ȏ0HUwQ+KB5*}uM[5ځ9* NP9VQT+(0qmrJKowK#Q6W:/&<ȳKSt &7? 7#S n}?o%"KEȋ,U\"i4`f7NFFzbS2Q f$)XTh;])3_,Ð m6}i]H#xg: Jb4?MgvWri?4~ޢʬ{WaH!H)q'I[c&Uu$Z'O6BsKg Oю_> 7|l~ΐY`Z7G:3>?eݿt{?fx*xM&!&tӱg>: 75ޡr`>A˦j+7ί򩏠pF)9&$yaQ3<+2c5iZ NFkCrB2}m~b1iV`,iRr;Uu"\?o2,7K>~v!ާ'qO*k,fOq:L-gN $Jay|M $0 ,:2+]@Ai;]@[q{j`'YS(o 0\3n:C%xݐty/;t2fm GTW?ޔMVZjpt"r=ID))\; M"#d04`_i e81 @ͭYo:oO?+[:{1ev>_#XW;7J=GPrzA6hz_Y*vc|G9M<6^B΋x} 4G.1Lkł`Cc#.(~9U)p)T 697=}hu}_`q^lQoAuZYȖ~! 
Fk4oZxO뚈\(SM!RKgf;aϰ̬%0$~\:1|`%Jb3n+Y)yv#$S#ui> ^E'<fBRư~,dفd"Ő RVE' Ρԛ<&BM삡2utԥ| *J"8KQHkH tȧtp8Pg7s CSxk*6%hU.蜨oi$BT+ܰ-GHN#W uO* o@Wogz'> 'R (xOLEm_I|4՗*P46-%'lJ>q#ՑD`i4"qBe\DWr4xp7)2ͅ#Cd@t>r(Iϰ>c䠙.ݜTg{m䚲HLQYsf)- lv^鳾v=AKoUfk+m"_f;_e :1UʥE^|T _Dzv׬CQP e~{GX5C,貐1Fqߥ0N?[]Jd;Nb3߱*>d:Rbtzr8;m93yTG51-tc)ZnD/ƥS[`]W6PVL=7 ])[O]FMЅ I6ծ[VH>+`m(cM^{Fjve#i3-XpHn0%iax0庖B)Iɤ]m:{h8{Ji[$ueZ|7r%-WHTFVZu@|[QTIў iO%+O/AK$SPhEC5"2@[Z\xYjTZPkHd*urpDp-͔0^t8p·{ᡆҒS>=8ڣ|kZ~U'' ;$tZAMr| 퉆 ?ƺ93ӣGa(SUz OusW>!k굶J?j) h V/[p,}T6<4Odv^wpWqGp E6\ j7:T- ސHrKӆKRʹGè Oy6?T(#|C}H PPk"&< X 5;'˸{h:\:.<%rRt8c~4Xj"eZ[h'6E\G3 G$E=̆҆[sN\;.j=Q5XK\D'8oéxf0'b'wjњK\TpATV\.DY[Eq#vI$o ipٍΩ 9f5ZjK1_so|bVMA+f.{|5 IHΎ7![b}"?='Dx z/&x "DTrĵBtࠢJ88@ l v#Jي@Z *=#SvDnʙ7au?O@k,?? kfDH~QqBIQ!IU01or4`l)N $\?e) Ѡlyr$0 ,:2(S-@̋{u_/ *ڜIp;I,kenl.;իϳ9gw?yXfR8  M]DRF#C!;/ ՄX oIf?d|WޟU~V^Yo5qПWN%SCPb 4@~9D,QTh0UbKTw3)ƂZm LcAh5{`\%pԀ RHO+ "ڀ.^'s[ˉ35 ~+ ~1'TK$3l$ʣZEI+wyօQܾ|j:3I`x~/)e;3u2ZK=3%)r9J{Ѝ3u6#mdR3NI0 ݱymØ89`T ,q,VCw?AΕDM T(BUIgT0FDLF5ȯ]:n)p+6 Z!:HU0\O%r܈eu^*8!KFX8U OIq#`DA](E!-CAh4ڴG6q#gsV fRiOk|r1Ԃ$'Xu2։efER -_(HW?nr?yv}-Ϻ&t`?))?>Exݏ?Xlp"٫x( ShVP_l $d>S\Gh@ jүg6lz6\/kR&0WA{\ک:NU&NgR)t+f n>eX ElMW=rZzKc` U>aHuOIP؜:uP"*@n^kX kAb֪$]E@=Ec t,/Na8'YP)2hf6 7XiJcԩ\f@ΌlE<*6UX%yqT`#۸Ul 3= # A #pcا`Y??,bݮ, (YY"&npr̔JC颕&F/xQKYlt) \D\P_dOɽmX\"ZL$f'$1b xg@TBx<؄YRV9TwDnl gZP!*J&zͥ레JQ i{ Ue"ݜëY(兊ID  h:0ƹR?~ZUl=H8{Bа6S"JYIHۦ15MZш1h:^v00峋ﲴWrӭWkp$ s^InN~A'9UOZ7W=y(|zsFrU|ָ^^IB-kbFδ΅mns;(@}YTP̐!p!GkfyKeH7ru\v5~=C-! 
YkJo֎q,˖U{Gjf-܌P1^3vq3ӣLF/,!j}ױUx(Ҫ~UuVmZeꋃ4 m&ʄ.`o`wv(2UuH\aj"ٕna +M@q?WG;Z0|# &QN"6ݙK ӝ0b~#qyaT Y8#JC%-&s)nC'6#lv,%F߹-Ye}T,T;BHpgP]Sz;ߎڛjChyv#8ĉ}L0p<}w2%RmE1s6԰Z|{N"5 T\~0pHw7'P$GQ&qk:?I_vgg~YOF̮L'u)QCR:7Qx(PEIe;nJ|y Hd>4nyZR ?6?c)iXQ& ^̢l ˫񧀧]RvS8_mKzz*9| xK9|˰-rkVZw.A2UPvu_u GtBĺ]+-jukCB޹ɔWqX7Uޣu GtBĺP[[x},Oֆsݓ)>OsXj^aD+~#j%º,#)J=r)fn%i~\v${)dOMDS}rj&9/_\% n.=h|z^~'h G(adOe4?Q ɗ*>//7$~ Yʝ#PNoܔ:ciH(d/f'( MM,P0hq`@yMf<]pfkBxN4;:r'9<^CZf1~UwK̮ݧ˧`[q*]L37Hc# s\֝T(,`Oî ,m?q3^5ﰊMTYUu؞FH+q47X8jy(!Ju"i-(9LeFj"#2Saog_oSҕ*s06P4A,0|sܛi3qe[DW~1NX>>I=MK*V_,ObI_,| ,%Nۈ.k4)z;2t>#l!w)U.wȔV5]P{Փhox^v&tȟrrv3;ƔӾEۛM߆!;$bh&_Ʃ-w. 6gWYzo_'AYƦeR\I&_~fH}r:܌Jc*G! w+ и B\v/$岇Be^v&ҌJ.30,W*L3!,-Yf 2ˀ|" WDrM!Tkހt P)!iQ`β(M [T*j2H$ӨNw7TveP8 [@ XQ m(T&'ޱ#h+g}9uX_%BS㟚͠sEzY}0ֳ ;DB;~)ws.`w8Bo{pOUXývA&${No0<7?S*4u_w7+ɫo)ÍS 7Ue}ZcOCҜK㊬#[*hnS:K1:4G ~d&WE=EA}C_~kB8Bc;'*2L+T4TRN T櫔́Čbj1F#4k:Tam/϶ 5{H > Ցl z'Y.E K"Z,EyM-ȕAz:9YflhrEPs'g4LY 0+f)UP*ڎb~1КrB{3Yw*A֤ q;"90V+jdJ$&؀B6 !ݘ`ڳ0%"k[%` {NC1@+iU-97>ʻ10|~1MQs90R *s6ī ]+[]7xPbx*`^U~fd_<^x5Cb_ 6lAb)"<~ 4E7W y>/<]<^ Mdž} 66bpFd#+וܽ>U /s߂q{3FJ~mܹ'XysU BWưsyBM\\;|s46C]ZX{7]Q/WxHMFFz91GގL$Fj]/MVOwd$G7!(bܬ@3 oHWl7*.Ť(lTphMh:F콣pٌ H1Ȩp)Z ez0V$ YBWXKNU$3XI3C"XaX S"kr9 DVR( qp+_J% L͂|.Ozo)S7Up'Ss3"DB*k&1!xJ*9rF4˲\Կ{z|O>| "]chL:>~>bkʥJ1)T[ hIdN1fkd5A:ІZ&#ۅ(FAFN+ቩ1H2iXJP-h* *xX9#5haƣOcwfXו>r46xS֍: ${RhcSU)T*Tjgl& EPNyrKm2k/wzn}Y{M]>+XU[?=*W۫6- T2|H S6E9'$?/><} ᤺W#FwcpUZ~xNU:Q%w=|rI Gta8=ƔRUOf%fk|~|ܴ x+5!? Dx~=ccDݕ4k螛(ixj%"Wq0^Z壺%&RNܱr3IkŰf\}RBR`@Li" #D`: $ڵ5{cR9+"Ddx"q=hWS/垳Bf<:$a$rfE|_=.ѐfͰ";˧Z]>AAZVC)(:Ͱ|=VR|Ndٚ(>%E-Dkܥ<( Xڅ8tVuԻԎ޵q9jv;-_MZ<#[1AjˆZUrBDӪ hUvh1j ->fJˆ%*:jmB/`23]hD*sDI. :-R%8Y{Z{H;6'ʵEZc--͎ǜ/aSE3'LMrY n=ĶDvr⣻ [N [ ;zU%y:#"Z1MZׯ'ӴQvkM>jBVҎuZm4yh A`b1ͻH,0gO C+Y͝WRwԝ xnFs%z>_m _Cz*>w@!/I})G*6Dw0U}3b)Rb{ڃ3D1 nd 0^4iBBq.( "WZ16d/\rh>[oHj= •{7ELm\ӂw5BHs͎og5on\#Jz5X!c_ф iur>ѧ 4 X#zcmh ~SKUAIi-&K5 cІdHfy{[Cm&;sdes}LI*NcJE ƒ[`RŶN/{S*y џWi>sUrg3ѨdRg1&}nM9+7I/[dx38;vg'l&/^힑%En'[nKG}xtKÎԧY_b<`B?~eYaYaYaYDBLs&46GYŘ)4?j~X$IjNZz}Z -xP_E5ckCLNF. 
"k2XsѰQ2'7MR ,6r VFQ&1,ɠP.Lnj<[8.)Nf3;%N6/ 'QeۺmW[>WwecY5aݦ9_LFQne5ښOl嶗X1u;B 2ieU=%ߎ Ii=$%u:Ezwaʄvu=MըznPhչlYE*P/n6Wj1޾so_PBB <A@s42NX1b2!OɐL>sG"og]oA-p3+[MZccQ j|gJEoֆ|5eq4Lh,l LZŭ֬pkaN^Wf_Na!s*aQ00vY?]X3'!IL}Qgi%6"Y1҄y#@=w^(xEv~XTm3t몶2b-KfrsR)0u;^J!9S(bT.i\A~'g[[f>EY,3N9f#diJ%O2G_eGWPn+6{yE, .?nn2ɅzI*پ߃2D=q-3ujci}/*oj=,]U~ Ox$i8oǟ}J]Ղb+Yݔ4 xÅ1WL3 $Nl4ZKId_ DzS U3XY3`G0~i3KZ_<9'[NRvS @YqH>"(TJR#k}K#Jaܦ$Aj8)]`ah\ h+#ȑ^xo$T=y݂U ᙬG%L[a5m&X)8t,>_-:iĔJ{n5?^|x+uk# >x/aaAx]bDBoz9W)tn>jZVV=ӓ..kӓ?]>Wq|_f=?G-pDĭ% *|zjHZk/ e7HU1$N,(VJ}*rRTI- 6ـ=+UTϤOJ{2gq~(PO5y~0Jx0Fie|T2O.M7z9?ke;v6teo>FƥAꓡCfܠQ^ Y,1_:Z8`&xW jɣE.~Eƿ\XtVY,p.`h_bе䛔[斕"$O.˔aGPgʨa(ۿb"eٮ2r1-KEQJbqQf~]}ޑlL嵄(KucgKW1qU<ȉz٭<W47Nuק3HZ'4tA>~CJae \\C"4ۇg~KƧɝޣ<DŽB 2Ute^S)`8 [Ƞd_Hci'IRf-tډ\(97ׯ,]$+[rEK n<>4YtRj5q+2w)v^>uבpלܑ;{W5pwQHg҇eIPBŇt1D$6_^rEH•[,~Dju4Vtr^ (,^Dfg=/ h0MTkDr8 ߾wAhY]`ʪ01(RvcۻSR3zԄ`-o;,}k/Mñm+ٶ|ڻ9 _l디 ˶8G/p@6N7ag8Xl`hax-clW/N`Z Kpu鑓>eְ]rtR\p);O⪽Μƨ̂w&Fi+UAxN7֙Dϧy'7Lzs8=7kٷVB?= aL?\ |X=G )L>PL%Ս8T?|܈߯7 }3\c)ۣ2aѤX,9҄QE# FBrAۚnf3xT7F?m,0eu_⟹YWQ3Cδ]GV8U&qb~Q.&E2Q Z>(7nMVPTblN\Kx9.Vw!iv\}R uElYZWk$#F\J 39/}PĉK;jXe1YȏhFO?R/30N,YvtY!{~j fWpoAd>la}uޢfwx `cHDE`F RʼnWofLl9A3."79UFe~\iB% &QVB+В[v+k5 +%1t7O%UJ-u]gzE6QfTcF;1?&˪$ؖМ``'A%q%'"p^ ?!֛.Hd&y}}*jwj4%.7|Sp cwa9-a2 Dt{Db4Z|@/17"ja_{&?/YBǙm7 ^¾fJq& !L^ؤks#Nuʢ=HSxn$#>9Qޠ|580fȼ=^i}VrÌ['\{lXο*?s|.[l@]Dz̩f(8`+yoDFͱ|s-{  nbְY5 p}%wU g #EYw[zUZwNT5ZJfq֓˳0^wKOl|%3<+n/GF{۪/.o&~]k&4ۙt^a|Ok1<_Տd/TZu] bV?W+-ZzgF:xIB"E4ޒn!/[5c4nGZڭCvCB"E7qndGn] FtD3nDj>$O.eJJ|EWǺg/m~Ғrw䒤{qM{"T›ןؔٔ[CTs [qk{a,><Sb*b[߿gkg F6Ojd/;Wj7[ylVYݠC[Kk|P|e>)S͗DΗr!E-$-V~ ,Bjt9:_=GAU6<um10?% #{W $m>*#\[~N08}c*-^tj N)I&.zNW*^ۙrzEHT*$$JZ=clwڮtBli%E<\"(@[$q%<&\ Xŭ53gu^GϦS E $B\TPh[z[)1s8pn*59b&ax_w4R;9vrCkLuZ o33%wM{mq.q_\yYTSMZ`M* []Ϩv1)\͘k(7ib73wWyBvr}PQkҴAOι06v `v/;^8mY]0$'OY&>lAltoԎVf#UϪyq DzaM' |-fp=LIbiuR%v5DQlJ9潲xzm(FL`8/<*9JF{13 q6&m (QU%-ў1=/fܧ|{'$ ZbhEIťYji[rJ+ `7+2ŖY&B8F4 JH\&Ev f:-VJ ML]@D=6:`= ZHcL❯D|)'=JUƣ\@c 7u!8A)3=F |\xUUUB-YU!@S9$f:6[欛irԭԍج_q!vM\RC95Zgr33*i+Fܹ8k*(C:c_(B_9q]p5+ #0s?=檺+ h%'\ 
J@11XU^}_'#8|:^',€ @~8~q5SG7k-nۖ%+,4QanHWi)$'3Ɩ)e ux?]p@#bbiĄ7RTSgG:}9 *RR`S> yy$)S7uRK[,^vqa4"_i3Xv0*ٗ.&-qNRL1cBtIZ/T4;868uZdSy0g-E AsR/g4Hm[`k3&SoA aJ0Ë*\IJ]O,)Pm7rN8:O-quXjP6^W"-8jsMOSâT,,C! @#`cN1/${Ӡ*bSTB +JCvT-.83Ɩvw$]jJi6zhPXVBW#c Ѷ$X;faڦ^$bC/tc62\WMqh=c 9M? *#^/cUV>#Bҗ6clo7 DLYgWS2PEBT˝\ p匁b$3Ɩv 18@qM&n*AtXȦrF+z{)-n8$c`ZdF}It>,PEl*A\?wT62g{S3HP0+CrZUYUӃ |tHN%KN14W!u4C1v|H;L;.eTC)M N e]{P5SVaI֟*:zOXha&pҘq( wgtʃ/Maji{LPwjBSjNl݂QZP4)p^zs-p1 5ӷ,Lq&jaj{z5ռCUt^.㮭C+Cj*^X/-CDpy8V@{TUjSV$+<{VKp1{G;hqX!%3$,o{Y1liՖypu^F{$GaS&=RpBhJ`LpWՁ*Z Thi ͨmCk!.B  DbkEĨs8F"L`fJJ >0</ŪTIU|yTZH|ɂW}ur:?Sv.v~(:FDSs:EʖzH QI88ZS-b&U"WB.ݥSwv5gE2JPKoG-8Q\ 0]FA9$JEy:P늃WYȻ{lꘖmTvew B"_p=Ƴn BsF#8EEHU/b#5kR\$B$G| 7yr2';'\xҵFjE 3v~qpDsǠi4HJmȃ|?w^L^}ssa+˺ !nkdD1}2 ?R:g=GU}m k3h5 9v _G;{oYhQ4uWl~y=|랣v&w{o~v/~8'0{khaw}m6lM?gANIgtng8tg% )_ y~c+Ή|m" %Sl/oE23p+oώ(a^vM$tejCeUtF6^5f~^|1|w/<*<<] Cugl)Ap ;_I݅=xVgo@)!Waz6ȗc0ƬZo|r0M:9y^g1ݭ5)U[7@4FWgԔpt pMұgp}62@/nj_!ᱳr ob k܅pkx>1|bt]nwEk'ɇaH3>s܉˗NxtAonz>?N|4_O6G^1ڧ|N O^ Ga,J7Wl鎹͟j?;/2r9GoAuS.v&<=+<N{yߓOrili2_HNVHZ,/f0:;CO(\;'L2'+f^߾%2=0+V8?ln-m66ͪX1z$ ix[S )LX .RG^`[}?ݗwY}komݷ:o%[M2`m^|pocsSkb6^\ 5S&f_ $' 6wtq Xm-jz;[?B_CW,Cd%Wss 7] ,$ILv, +''`ݑin6^S젿 ،=~ϗ;[ &sTL2,.3YܲHZpOwV7[fdse4C Yt4, +.ǎyk& B1xk*j!%;V JcO"`鮃O%ilrg;٬;[ԫfV!u0u w9NՕ,eueŬ2U *Z~+0i,:R,:ȊJ !͟nEÉ$,K)gYJ9RYVL9+C$U*@\ցe̢Ws=N5LZeV zAGJǜaM`1ZDr$z" 9+ NVB$Ձe 6Hyd`x{JPΥgً1sc ICuEwm<^Q,a!)*6@>/<ՂG$ӈƌ j$VJhB#\GI؉XYѡrc:m ܪ(fB-`,(G U1JA72hAn3ߒhK^Hb`RD Tׁ%b3-+ܪׁܥoI$St " &"ΫlQʉJjcI n`ׁ{@p0 @ 9,F\OUԫ }z=mZO!kPxMmvh+MXD4\*`HJ.4ZbS g*~EZ]:׾@r<,O"~>05AȊaƒҶP!F`5QcFA 9:jqCe\Ɯ댫u:jqθZ]2&]ivEApwzԕUf2*x_RtIovN=کGۣ} t:D @Q:9C<54z12?sT:T&l(ͭmr<7<4 34 Hd LJFc/7g(Ũ2lK%1r[=cG7 sc5gjid9mE_m;=87o6_oooUk'tAg{6Hʏs?T2nјM >S{yȤڜ.spSҗɜNSJFьSSU;F#{h?=tљ=E0S>'k?œho{/KQFy٬-łk _Ar 5#,AJf'Uj?Rc? \\|k$$K'YJHht̓u&ۡ;|P ?iW0DOϺ'HzΠŰș! 
[y`(WyW)\2!ڜ}BTJzƓʜeKdGBZI)&|Ԟ;f<аg՛"DIOWyy*fȱQ֓3 H$ 0k#wQf˃uJEyYmZG.!_W2wi~TvݥRB);Cx2zGpmA~hK ٨@(##6(we1!{ϢGH):2rJUA8EvT[HY[!$; yE ^I .FUil%)\Y%֬MՏ&1(h[z2vhۏ4Ĝw=ҧݪ4A.0{6e'Lm-֘)]`6X[-FmmwݫvGhh@ e/潐b^ `5=[lYdc[ 3/*=EmFFnEmd._ݢRjJ!Zρ׸Bb*V%8k_ |oS|e*zr-*XW4%ga@(J0յϾݓ[xJK8U=nVtQnp\)=̑mEj@K9j#+W2sıFkN_R})0QK{䬙;X%З'<oW-O^_yԂSrԺ'\36N'[W᪯oS}|RپR{>sGy L V/x>M_Ñ2-'VCT{xA^EybvP" ]^^݌8Ϗd-i//uj4?6g,ǜ_:Vg1fb yZFE&_~ݽY |;"i(t~ri();`r|>O-pƬk݋l+ܝǜ){p"jQO 0fwz%;wp'LK pY2dմH gHSLfVO jۍ/ l:wA1\[¼ \48#'!JcL@oyn>͊۟=v;E24Jl8+sNjeML0+Q-?SOel^]6.}#wo\k>cN(c9zl"YmԔU#h]9OUc@Lh8ܚH@`|dw]-93A`т!Ҳ^{-,:*p2 7y r9ݞ*{zݞD'#[4rF&y g~w C`-ڕ#GπZ/O+^[b&YI"ORzM\LR,e S9oaʘ. 4̪ @{MzE]2/E((aX+`YGƃ֜c~=w Ȇ }M}<9wpf [+^{buOB{8S]_NFƣQJb>J.,!>eCȀ ϒJ:8| НhEbOk}5,~M_k}͢Y5,)E yTYJGNrD2*M=!q#h&w4_֓ ^@5@ů)3 vzwG'RE2RƁ12$%#\VhACNAɜ0J:P ݛvzzDo?=mOo{zd5T:?4oc*(*d4H_JJϺ&Eh~ru`~ w, c`+5gA$S ϗVFԀkX,-ڡ4*93^pπyIJGɿ:P,hoYPς^ ?{Գ,gAcA20`AeMnAfRQ.!L|TL+"h1ӠAM^A7A#R/oo ONCײݓEQ"{!Ĕi=t>"e! "]! '$>( {L_x_}/^ŋxŽdckm5(8㍅bu%b@#y  mFdL~ְ&X i^')amKMWnՔ&bDyk9VI5>`6_5P9h:|gڕZ:ܦryLCgÍw^<,l> !J|dfkU8?jZ/<h,,qx3KNZ$a*xDDoԾv{HU?Ԋ װXtG2*ڟkv,A^Tڛ)YQtVB&dfAd𗏻lx'f7R{GqivM>`f,F@hJ_uht jjr^byUqm!:\9-;ְhO.bΓPue)Ce_~BMF4WZK鬇iDiTLgT5nJ{^qDCcv q7Wߒ\I;i[oF77nv›hTS+H OfȕFXx4҉Y(9G@) PHBɚ=vJ.xgbaͦφjT[7ΛS`Ġ eV^ c$}ѵ2qKCW6H[HVϩYQ8؟^rDR7В_lPפrd:}Ykˏߨmw}]n.+Gg Xc Ȩ,% 2gJV<4D,̫PߧɭKNlevs=!ŋRYyZO,FomR(eNBюlRh+ "~XdIf1FqTG.N9E})S*[&`=FR]qCU GݽŇ/.!Jil%J $QQ"GDBh)!'P.gXb*; (;J imR\.ꕌچj^q+PfbOu6[McRyj\`YQaΖ&EȕBmfYM&CV]9ϥ MtL.h2hjgN*>?eO^3LiS"J)ȓ@FFC˔?H#$t "Vd]# h@-tA&4f/f .f1 T6܌'qىi#E%1b@#Q aŊG%(dG2YYEQ,Km!#b1J;KЉ ɫW%'JN^䍆d,A _^>_>T\9t\c9KWZ2wMq++*SѩDPh,zFmYWX2sD%g/E_-p3UHZK3qDj)EQMxdr2* ,)T!$'Jfm-eȋ(@ORFX_ODfc\qd=^LލoMofפy2/aS~-ězuڮ -tZ_?|WIikOr[2AD+D^++.."&ʹRg4tҝMQ' $IݧgN, n:b ˷$$9۵ۡ=!Ar0tLVE Vbv' a8/Z nr}hx܆oHHsE,YR|φY˘)- Eaj2LI\7Ls7fhäQcEk\v=%eSdac[e,O%"/$ wn2}$L$ɊLRZr-L<%(2GKW6[n'GrLJ&aLr˚fw)8U$ѸR(UQq`Fl)_@)ȅ1ʲR ORQvdHAookec՗6;JblFMɄ&sɥ06O,UQcT'e r%G<)g&57u[Rv6{T`1&W"S?{ב $ q_ @vXǻ!UC9T EiIӲL8a^͉>P x&Ø^ɟnDG? 
;~f8l/{p =A2Yh4rΩ;!ZYSR0}bNf㖶 Kz p ktZFuTi]g9pcsq@:N4xڞgfOB3fV &q&HE@EHj6 $R-U24f 35F]2rS?5Z{<>b&zM좨/gSzY=dMS+ZjQkt&KgqS J]z|dk4O'pӿxl%3[!X5OhΔڐeO72rĆ`p.4YDr۠$j}{#TG<ߜIKHaxfgX R9p]!:' CO7c0^}|_Z2ci z4{07y.{+@|z98./O2!PgΟnPV˝+;ޯ!)+nl5-5E-a\\YSۤku]B9MLzhӚD:|z^H+&3w/~ɻN~1'k+Y 󪹢tvls@&CbL!flqhZ&BV28r.Kj52ZrEdst|Lt63 9fSN[Zs -yRLӃgcz܇'yf2jό<9dƦyXH 3x u(a# .<-p5XB4$/3 1a/wvxr)Ky۞g"/]#6VVK͝Eé9)Ǚ}yK}F:]Um{ -r'AtK4@UK0(A={GHZ)쨒48T3UhSg!TSCArsk^bxfO'j p#g_ooi "!Lbx#V]ȅUtf-e;n#UC8P*qg( ~!dlK@t9ͪ~!iOԁVYCwz]TZR2H$vam/8D~(Tl YL6g ]}Ov48Ĵb;r~ӍN:e`T⊜e9ۃJ1΍BASKRvQy۾gXF#ei<.{8<6e=oj}q~X-J؅/WyM5q:XhBR%{xQ d-gCX3ȡ-S2<%v9?:9{:YrmZD${rQshЋ(&hXѻc~ 4}|~ׂA]Q[OLfeu`hGvGG=h$ a-eF{Rf)^VJ ${nDipЛĎBh:Ru[Vsdj?G-;I]c<2M VU0p!ښcMAQ5 hTĥfMQˉg_f':u8-->')Ǣڄ4/&hߏ2*X* >207[ ޣf۾gR7w6?Y ɹ]CT-D}ܯ56m R;j a25".{A!\m#(e61ty9y.޲(e)p軗޽ H&jE.* \LRy؍2U99J+|eh^k-U1JIb.S`NRдT :43|CK9G2-Rۜ)(7(e`kҕF o/82ԸJ5bYLrP)C^  ̥,=Avj?ʘ.K9@2.oqq 7 \LjsK4R*dOɲ+ RX{~84\qBj)s6EgVJPK2#HZBGajm?KY깍^c(sJ륌m]>0blv>Rԣ|J*RFoL̔ŗ">m9z^uw5H9ܙwRkd5[ߺ@Lh/'W񼤋w7y֦'?__*G`$׷'vrڧ/J鲶{{fM/kIJ zWo9/^\ϫ3w5^/Oۯ0߭BJys9h2Oߜ积^׿o<yN/es/m L9.BJ 6Tf%X0UЅL%(\NPSw/I桟(6Ԟ~>~,.OKq}h;G;2EM5AؑC lGͻ <`,nu>+a3y/nD?xǧ((=PPzݾ{>-2^#_|2>{=rC`~IvZ݇Uyx-@#&@)Y&ʽNMJn;*ytEbkmEC>b9IZ; ]݅f#N PEI449i%drf[32\ HAe<J 6gć\0:QH?~}0m6EF^Az˘! M`@#LbQD +_GP0 4XDG-\@7{bn'BZ5^0deټ jw\Q2,CϑiTWYr x Rsj=ZV!2AzM{ŝ5d.vIޫXTm)кuE ZXKTpJ䬦Qz;գ4Y{ҊͮI ܒ\ӄJ\"P1(5x2AF w01@A\KeEKaG) #I'x 4g {l__!<]HlՊW &{Dv;gOt?jrE܎?EtAB[iz>ƞU8܋byDؓ`O( @d2?grxb3a PU#2#tnjՌ]I*lYmQ@:^ W:a2sJRYSZ0e9^ylqdz{ }4xe5yZx9_4*C,Mj$^2gǸ>O?@ky60r7各ܑaG#[Χ.G5~NeEu, q7=jɻwho`pJXvc{Pۿ=ymEۣ a'V_{%?(,0ZɺpZ ǭ-eW 8tVj0JAe 7@W^e6kZ[d<Ճ{cpuz癞DI3i:+. {h 'urf[H# Z#Gzrڽƌ790#G'  Y9sCN5Z7:+!Zb'"AQ{$LqB>#* i~C8n];D%1Cba}f qVe)=dZ"ص 1qwI!D ɻz_Vu7Ρ3(Tk74*C,[G禓Z~B^jep rQNU) :l^TNvl$SyZb! &KI "X"{0VR>V4P7Q܌Uc3zUS ּ*Qf{aȊwՂ'aGb TqE oX~MakH3t!R! 
'fIfN1%{)kgfGlN]l&z}KXI b >>@})%剬Ml$-EC _ ( W`#{+iRЗol$G|fqpV\9vva>]a  .-$Kdo7B)i9eWt9i,4 KQRstA Vy7p"+nڼް9czHUጕ J1mc{ߜ9 A[ύ x;fMb8Ժ^N8 N"rbضԄ;&PHbk 7=(-Kz)ڕ*K Ll G.STվ){1&bڒo3_8i7'5z G _ƉdgmjA5 %%,S YR+PʨtR|C&n9GU昹R௦W74W-dƄCZREC BV0 GŸ  8 DA+lC)$' Rw,iV?]UIloZ_0GـaT:V7՟F@aAnCͅ7}U_qT~qЙ;{E>(}9nU`ӆ$q˧,_jWI~K1! wqb$pݑ{0R1M=CN Rؓ 3FsNy46yh@)+HqHD1"C)q#1c>{yhes,!`aI0eB"`*Y bE1( r$@&X,A5l*Iǥr-P6 4?d%qRhbo2jy3Zٴ $Ǐ͎KC{#P^E E}qov!?lȵ?]l E\`R/@_bj$sSJް'O[;"'Tق%oƭZ1vy"^x 0@T*؇ű=Tƀ+|oV!0m0Ѻʄ-`]xs]eQ(xr9.!Uonz1qVN3$.+,kS sJJ]{%l$5u{׷6$5hER`٬ Lૢ!BO >b(B6[oƜ0?6ً(;}Wbł9:(|x:5mcP2$CHsh~<J\fhE(4 IfcWa[fm ) S!uʹ0Ux7 l 樨a``AW%7X<ΟCclyG>#ƞ zAFJ{uc##-l1pP2L@.ؒsDJT3aIeMlˡ9CET A@)=s=QTfc3 *\7M6f<2K1+l9&&ۍP%5Y)}An8|ՑMApm+ktȶj1Mv!f]'84"tboZ퇂9:;#ڗh;r:;(g-f~8Ƅ)WΎ܆;LjTlܚèmCВ ДUR5oWV|r?qPV9)n(gU9.ԖDzR'M&샏Tq#N!lrcސhr`.=#=c(8h ,iA \Ax0Az{HChHNP҄Yá(Dn&E?dH?Xd5ܚ#֚}yy/q$$urBې_dNƧM/&dJ)Z;%BC`EN2 [(UXeJQ+l"m; POVî[[%ȥZ7]*JS֘P1aOoГB1 QߺbGTu;63+Z3PwUպ/UaJ8qo+'8 ~ `Qeoжr 钊LXq`0\nTq J֖(5YcA̧JPL w)iv p+l8)]&%묆>]4p0cǴU]4)ZڊUOcko385+difg*tfU?0~@m{F4%B"aþuU=z0=qз_.:#P~ژBrr <;*ijݎ?M4VaILj̐~oloBa6HheK2MstZO{gn/ʙEIh .g 1w?O(osG*hhoEinLğj'O}cODX$d>軗~eirԤ_g/gMdt{/Gw|={up@g/W8u6SU28{7bL懯^Og7o^\۷Ż[ua~Vb1vn{~o~o/{c|;E~Qnh`_$dۛda7/~21{{W3J\59(E,6/8^z6H^}[]6WNu}n4[~QN=Lc^"{rCp%M7S芸K8fY6pwnd9$WAvDԎςO0uZͬ=z?N5e@K$ շj>ILҗi1"^LJ}n6N/??4jeuw;?|}A2Nde}}EF,O}Wy7O4/TdzCbtk$7~M2 >̪X\}1>?Lz @?g7W ]r%k2fY2jF_gjlfzקj`Ç֟H}̧2:Y+N[F5kk&6#d`TvA֖:mci{ByIprNpR4ڲ"Iݠ!^6eJ hc7<8Nx \gE&1hLː*cJL2n!l4(IU;jmvAVs<%3%i$| ^+p̉ܧxFkpu8H5шE*(IO,_;|OzVhA?Z<1&*VΧGӧOa)YT=Q.]1KN ZF46>iNh;C}iBeY8[tI NuV8u뼽 #|&hW˓"#eznϖNVqٻQJgeQP.O!=P,L+OCۋwwV3( EdPܒ= 7sJʉ?oGwms}Νϣʅ6V)|rrT N\0 ~Is "#W<\+|~""#N}QM3m5^Ommڮ?Lp8~l6@^nX[:oa~pa{u!)\ !M1 ksiÙsWU*gşoſo?|K{~x& iZ yK|+7),UUo~:y9뎹x.iGG`W $;W~@cr=GN] 7Pl.s] ~|%P~`q9ۍC6Б2|ߗR/e'*獒gxJ*b]t* d ݍӎJMzһ?ZS@^jvE?Ɛ5m5|^g]#Ud `غs4I1oBsH߿i Tp$,xǒSG_H6jJҸ0٘V8VDniQS3?6,=L=7eWB )ZL4)%U 8N %=01!כ_ _`sJk8AkJ9C?=~*ګIvqoD, SS~Y0g?la<!!xp=C -w92(NnN12W{9+-.WN=}^ڧE>\έ=1}-"͏Iw4OW{\z)s,fr|beAX1BxfMCR ^!48>;狲Bk𾛅WaX 
KU^W/>>hA||a+8M_IܤiDhGfC_1?s)t-MciUV_C?j.o?ַOPEKF0B::\}%|]`(0;c3 ɨ;G9~*dTXG#-\j`ezFrWaOzk?JӊG ap>uZC,/cj^l Pyhь;ꇬY?dcMҦgiӃYuӃjpA3Ɖ4(!b@\ *h,2d st`]?>QhBb޾;y fvOw;xcKu&P$Lfǐ`h6(T $>•WfhFT聕KG<5J o1FlT{+3ذwaKvVkr-oy(]`5pc0WW_wD>\C|i/Fn~i9##D%z?xv-Oxу7_ MvAtRðN9UMC brܰSה1:-YGkSZ_TU<8(,8G[w+Kmyꝗr3n+ *-Q)Ajw%zTD;wxH1xZOPPEppu}ٺ&6U;v*N<֦\}d}\WT5MTmDhs#d.,^^u#e1/ JG< > hj e3lG+{n=HY8Xrqֱk%36:XO,M:=h J\VcEYNc_,Lbwv!*j]TXD2{#3il3%fY O6mT SW> 5v dzsz=bHk5 0M-EF[!MD b`mL$$H g᪀$kȩr A.Co /?g>ƨe*fю8bB:a” sMrPA!bQx^M2 yap<6"LY7 ,WMC"PAәa eNG(JU" r `>L$<^q"h0FبJ)V.*z݄ۀs)E*+qʟ^߇?*f˅`L͎n;5mƓ7@Zˋ`^ˋ߉F|VtWiΠ<=KІ_^Gm/_'/q縍=93pDtLG:Ab8Yp1@cɌGV{cloGD@ rj~gJUTVTt5B&]WiN3C ZjVqO3H4 w/|8F Xtod'h0C9KZVݢ@ %|>;r@ȓoBDpX8YX\B?'o:4C hR8#3ZߴhAA= m+FtA Cݜ т+|2Ĕeѿe";}a|1mQhL&$k8- %@-y= liv/ Qu9)gy/XHB w. F(@:/`(Ώ685m b0{5BnJ/,AArB)}ޥJ|Սl&Z-L"E#FC X8,m<QBtQX'W d 4 S+$Rg5Y铜%>a ߪs_vRMS_*EHQ߳cg 4 JD u={ A+zWm.XgrÂA G6pSFd LXP4=8 T6 #OdK=DA@r-4P=IbTi"CI1hh"Ǽi ߂|XAࣇsAXmb0*0,m4b#=ű<{ٻ;Y},js HZ#@cxiGAk8pPȒ&R5wy#@HjP8\ʷ V6&jV,cYKFUj`Ml>K[[ǹIE-:Ks^r.":1(cvE,S`E)\lU&vbƷ̀7gqgZ8BnoWuXHN_.05HtdqUSP%P>iΞ!Og°u?sQZ字Yѳhti{ȖmyƑ3Ԗcn[֛>w(rySlLK8p9ŽM1vgdıѦ}wSԎb_fbGpb ON+>X>N\`GWv ؛+s98J@HN 2YqCla7)KxI;۟p/]*~rq׻sp>YzGVl)yU`a[Ӷ[)h)ߟ ޻9Ŵa2G9νQ[BtQ `ֈ@/rlhW%itd ``5+ +1;SɗX1f%j΀Z2DJ|O>NC[qBvX0y}1(ό?"n:/ 6Bugb:hCCzV7VaG|~0H'ճiO}L<;?; >G;o9Hԅ/B#t9mgLUAc+4=Mp5*i* *jQZM%.֒}ħ\X׵([h*cN@]LeU4IT[qy[k{8\$"]uVQzVqÃ讃z6fs/Uns0C\˝  |cr7>MC>`,=u wZ("]Gy6tM~|z[^OH$-cR kRȤTCdAu`NrOKq`ԒOS}+ך?=:xQ qt$wz61>G91#|Jgu4aFKlAhx%қ} H#O^ImqN@VPV. 
[7'6ubLqɴ̓ Lһ+ҏPw^zN،"Trdz*71o!fCktIwggvvP;B-@}}iz V0m->єdRX z +]߿Ȕ\ DcېiQaLh#dsLXtSn1k/<&kn"zwY~&s#2?4v{AJʧY(Ͷ߃}`{o.V7WƘҔm)Jȵ%JIu @ vN+&f7|wsC>I']|D2Iv| m(Od$c-¨mzwWNqVorr7zBnY%57d3g/S9Xd߾N *V &Jy|J;/ )Ic^cP(["b&{Zl&WX{`!B㬋`^\Pu5%?qxZIr>=}z9$ `}9m鵅2|1Z@)-߫i/)h6?MenE6.jA[ f(W}#Yc bT Ld$G"QC>P JT)&}3F%rI(2?-MkIaQíV8v.0:o/˻h$&W7β$^*ʝ_xyyq_.2b\<˗=%+ty`O!^gKȿz!>ځT>̻n; O(:jWф]e>) ٚY3ب{zoF@>~2E=.H׭[ p /l21V^p{e-[4Tֶqby]* >ƜݶT ιBgBov57eS>Xng(&;lvo.?+CgfcE۳X6`9"k' vC<Ŏs -@pE-~!`WHO-PjsG(dbˋ3 <_k8&]hO9vQkr5n| hҪ',BR6KWNpԊ\6׻xr}^ 0@J|ULX gGu)cJYZA}dN(,FG>*5X`#z[qpݫM64[4(j/:_^XEU{V!Rv6FmFi7z'A!WJ\虚a/vM^/8,_Tq\oRs6oKžSZtE )!n" a-X 6+i=#S\O|dܕQs:']?~qQaJ {"%{Y4SI@Ulc XNst rAEA&s9ӰJbx\ P6ɇ"4Bwo㢚dnB'(G*;dYo`>iKҘ7t׎Vk{P\QGԊ'A[yt$`Vђa{<yh7w }yqr1 + 7L\D,=L~z]Hj-l@$eɕA!.@Tyb@JCY2\%( DZG.XҒN! v%j-6b؁VL?YbQ,s-Yb[2h5C_P;=g8*?xwbɂJ""o#, i8@2n1ȱՒSB;ku5G>p̃1)ZQ^R]+ 1E&.Z-qt[nU($郜ӫrMMeM;S Fqq#b)r68՚ЦjhIjRqnC V(&D'_/_?l؈6Z~85Jd 8 &;ݶeۅ^0ɶ}+Xb`!^v\`GӸ/{, ͛mxS`(iIa7ӻ\crw䢹IV1GmVw))yBcӿ!1mL~SyYě-8RiQRD38 35-EЕ5UY,ee9/ pmbF$ʳn.ͧ l1H<+V#HԅFs0˘ ;K7vCuv{/1XGS[L'pG eΖl2^G;d-l'M+|ԆL!أ6($ Άm)::^vR+Xv2۫+$HՊQ܏s{B 4vAɽKB[׮: ! |kwn3OkSXb\?XB2΀T6 EŶe^o?1*1]R?](lh7|xcy~S_|>M"d?{:b [ Ԇ 87;iMK1ZʀO1Zw:4 ~5b'4{;(AX"Tۏ"8I2l_ 2":bEp=A1%fm6qD&sZӡ_DR[cjyLm9-ԖRnTADoAG,eVH  h%U(Nerߟ%gjި08ri5h+ 38O_Gc~4Wd m*[މyOQ)xњ^EhFp\ |PZ%6N:V_H/\7ծ'ɵZao ?|BeCrP!8"q& 4M.erqs1qն7fI/{Os ! DET݂wF1RKA>jᮏ IPF57vep&ʧ9R;N jȢ(]\ c51}1|^QJʒ9DFP;K,b( ]K6`Hz禝 aOnBRZ"\) :,טn Hy0G-Ǖ_G?a|Gj)H{]0kp$2.~vT{$ >=9q4R/- #BQVeߛVK;(i:aҝ*צd"pdKfOe {y-EMbPaB p\qO=Z^#}#Efg,Tb\6 L_UQHHbwV)Y=88BvIICnշ jtOeT:dXR~:-fe访| zO΃s94uprpN5s))ؿ2|+ fqۊkB$wGPL呰JwLwNip|XT[c`iI%#ӱh5^i-OçʏDv:{]Q>{V!ǔ`Tsk*櫕R!-U D!&XJ"+pJ0-"$A*|֪ZMu٭:guqwM䞔Z KSA7`#@)bMP( 3)Y@z2YΎLX rʍjigG7g߲vPysX%]J\+o1+Ild9 :)/tXT[$ 1!K^qەLJ,W Er$Zn gəu7& PȻ$BqR C1$TRb&q TR`iո%=ܻ+ Y${I$aÃ&!Fn{ЍI$a&!&͛@'yIH_iu'ӈ zL޷i )W,{W{6O(Lӽ"?E1%rׅBw~{;:HTAwu I \rPN0:8^wv F)bSHx+N ˾NQZ3H(ר2~u䊇Y-y7kOm( T3~ڟ"bYj"4#~n{#6=AQ01WG??] |RT>} k 1U:1Fr8?Iq4.APŦo*G];sJ{o>?  
n?4|H"ǖ1aBGiwE6~HĢ;(FkWE25[p_jDzv"svOfɜQ;m>W ܈a l0?lN~Ӫe_G0K%kö`~)O(e_DBD)ӧ Gfgimz#c}w߳-273˘?b)UlZ>](lPf7|i#GN;s v: ,0r,Ps~7Rj(QpHsG?0?EOGob5(wѲ:Cl wdWvKBz_Ⱦ؞Ԟ>Ț CC.f PD-Ϝ`[/G߳;͎mOb{h]:ĉcN-v^_Ϸ)GTLé L]y9OL2kL)qsWo_Xu j-QvUϘ9!at#zٿ ׅD sV` D]]ִᔑ6j S=k ~,I2ƗA0Ά{;_6Ѿ4tccSGKֲeG J$ CRERj`; $OaiE"vNuE{).QK\h-%\Ed,2aLs.xeC@c0UfL%7ƶbW#hXO3PO+c m#y#dV:.@ JyF@RS>{ʁ@ ܘhH6Ĺq2\"7z{3\Voȍ'ߘ'3='ˡO:Iyr5YIH VڹNƟGr/}gG,RgE qTu Bsg~<(Nv ;v~s;?[:w޼{Q1Z'b*#Kh<e_ać&W/`*ߛLI&X+1BвpAi M-%DNNkTKl+D`H5$ WF!')MY-f~ $vl>&2el@)dӅa PQܩ5P6z3^2D<7[I^!hX IΞfۊ%YN8}ݒ5dGCeOY]U Y(Qz \*WTnBq*Fez mQ(nmS{8R\#X$Fk=2ij:bΐ/ouRFnq٦ MS7rKB'l)}D{zra61PKZ'Co񤱏Fz9T؆sֆ#%ڮwh2iȜ|' 8 Tk.7.Ac1G.F'v0kƍ-~ T%eY G^N*)K](+QQJY\ ݏe (#؃PA #O$JZtc 8F{W *јYi 4%jt ?bcwrQkm^‹⯓+B6n|=iUy{DBzh-_~ͷx܂>???`wrsOӝ_tT~?`U^]l柷ggnpMÀaAK~ { %YmSk#h[iR5pKآk^`"Vt]ӡ JQQ&qTͲpAx҆4KR5a5HfTVh2xKV;6 iEEO;=vh-fBf8Sϩer5p CDu8yHɅ )' 3qZ JfÒ6Ҥk={(O\9uHUkkpt/BL7du&?_λ~Y1)]%jZ'7W=GB]HbYW:eLN^Hȣ {7eRb^ۻp\mrQ#Iݷ3tW;zD }qW .~|}#ɑ.ЙuH!Lf\e92\jW*'#ߟloe]X;f?n=X|ګ.9ޛӛaRda`{$ 6YZ#{j`=t/0@;D]5 墵0hvڬQjMGIIII]uܗwȄSzMC砕uJJg5zF#x͜gps wc;eا{Ԓe,WcDTL3,kc"dejа,^0lê1#T^֤qaE[uZZMlx|zw"KIOj0wXT"k|lT*~R"m"5'j==T$39;vFφLI'oa-0!Lg;}؋܎ Yҝu@%g6$qnOp<71Рf^@ ˪7ӌ/"̉t$zЯ탐h5Eq8SyDJaG,0Ӵf2ȌW2++#Kυ|di@dE= Kk|4"F^-s$ d$!ĕ@/?-F4IwE<\-)A4/K,[fMM]*HEs["kOddkt2W((W,ӣۼֈJD^}%E ϽB* <Y 8.>sEʅB>[is-ZH5uV6JZ6$F+}!Tfް!!T{n_D[6:}Dƨ ,F|nsL"ZJ&ʜ" ﷿9Wsh 8F0)J.-sp0j( ]djUgelc[V;-m:ٚ+ƅ2gog5HEzT~DM;W]Ijù^q7c*S߻-0z$Ic[]KJWW%2sW/Ӭr3"i|5(kϾ^~YQicWDqj+S8˽~Ni_ɡ%Ի) 4 t,[}sEN!3ܱKjzw$W|B}j(JULh,riCu}r[T[̄mCBӜO!z?߿ċ0j2sl`u[hboE>[i+ZA4D37J[:Шh#؋6).'UT2ZQޜ`ἡ-pxӇ_` uWJ nY>ji1|A>ū`N^f"S#2܊ʗ`Yljjz3I ~4j~zˬpYNzI N,ML4}+e n-vԆcZ~'&!KfXRU.-zrA5=,T/p(Ts!a0{OA/5"xiJOWhn}AVSzjONU S$IH5uכ/' @!M'Jc^%l/kށ<+|&8cy|lN{o&G(3MAE1e%\tt< ' 4mQ+BsqaOڳM˜u,zca+noV"El?TF{zra61Tb/i]>5ƭlxtrA60 A/KMꬦtN*Qe'2 ]I tީdVSӸ'́FenwleTH1O (dM3jf@<1'w^*ݙw<ˏ߃<*O_{}qWGgfQݣ6swȥYBgh#Y+1c'Glln^w - j|orx}F:Yo|E}s`rNLI9lF(N?o(Ꭳ%0w*W*T%B! 
PKU8S׵ ެbT2^*u^*,WAz6ݬ\|I8-{OY'G 7HX]<"…5|iae[2/2NҖs9R=23_j\ErJD-YUrd!xIbq{~/GX?Ԉ^ vy7\Vea,sQs5\RyVˆ>Ja|WF{53祮*0(m2MSZxA!BUU/kp|kMr.^>sL3bF!7AeSN '=͌A貲Vx,{q;˹dKf)o=ld \X^10oyu>z{65 J+B 9ɫ k H^7UU'$TUzUU=y-ddo Sg+Ye$^ZU(e ؈PQ( D׾5<~߃= `O=ؓp~( ;%` ;С>We%3#. O&bKDd2V?NK?N|LӖ;xq4^gq:<8C<(jgN瞋 cj1YV 4Lq-ǚSðj]^\gzF_1{@/շ0Hs0dij;,ERI[,HIvSM 3Ihjf"a C?(Dg߿_oOv46aWWy{ûlœENI:Y뵼}Tk>~./a?+}:Y P“-KVkbx4ztg4']IM\ڿ{dR"Yyj0b&X}DT>p/Tîx7J{=H3yݭZ؆36-룍F~>>d^'EwTGXFl& y~G9nb01TG*ƍDzLO@X+<, C#ǀo{_l'86hmk%8jDP;tœw-a5Ԡ [)íκ!!M(S 90;VgAplww\~7;NHbݿᜏDbУ1ս3E`*p2wDOmL7`4By"q?2A11B wѨgJATpFjJ]W .Đh/~=b+"yD nYּ`Xuh7Brm8.ʣmV@; r9n$zsG~l{lH5F"fm*y`|< apBʄBs%Ac G+ס[OZFBfyq}hV,Z\:)_EF Wbq\fL X6AߩmWJ& S{l9ؚ0jcۮ/$ޥFW2_ }!oףVȄ[=ȔlM<֫k|Q5;_t@ٚ1pwWJvΠv(Y=&SeΓ51Dz?^LaY=h-;x[ 3B2#n m/mxKCTv5}֊ΔHlff^C؈u7z,LIf3Ի$pm;-t†&~RrMa=.lM܁,Aٰឳpuw6r4jЮV¿"uq[?93OmC6G9~s [N8qvY8Laӝ=dtQCGyJ#Gy/+os̯Tu|}}?\Y]>CmS0p,hŘB S-YR#p`v7 9#O#u\c7Z= 8uVCpsa#B Ic+$oN^gd4M{Ĝ JH| uJ Nf~fj&ap,i0daZ?938GJBQw6Xstt燠r2\qZr^"!TdjZZPNNC'^w32Rkp4E1!!IK4@O(LW2F!({^T4R 43y0zƔSœz( Qo̻p5  ^b䰁N9g<#"IS"4R&f,&P<)$L ^k869g_e# O|8ޟ+~z"@j|xǧ?ܼ\Q(?7Yx`zŻ_:D$`;N\!zwy'F|~p/mog? ?<9%匠(lbRR@3ZDzBFIM} O6,~EE>NrQ+zՐ8)5%vz(Å+T0YD2\KFP'X텍+md\ĐJmj ݲJ*`_RefʹREWYyE%gN <L>[Ny}C.S",;$m~<{0y\[I 0PSHCa;1Rs]x9OA;z6<&UxОjA,ysFpk$ܲ oCK}wgǻ l ADLxpL$坺^ƁtqP*?[n8?Z&β]Ϯ郞Efz!svngH~ULiz~DϫHd`{<ԁ I$,6Koooޤ!,"OuD,aL.LHCĚH J j,JDH%4\jio)iJI>`eV̴k8&{I@EBѢ0)L\4.Kppw@>}ӭ g߿L‡z̀W:>}<UO5o\I8 ҹmלoW{I:YgEr>gƻ?(COT.A\5һ˝(\OkǨf:RJX$K̚׵FAKtsPsnYo oγCu"Z,F1 *acǡ޵U8s.]Ą vջ֟B5fCFe)0F1,r (ҖIqU%$_\r9OIzz[Nrz+W :8'jN6P3 6O㧐"v]X !y:E*pͧi'd-o[bӥ׆Fŵqv0ъPDMS h1܆bۄC:YnN`FV\t1O\IR((H]LbI2_?.ZѨQڛ5ҝxB5: \m3+ͧ#/יj/"A:YR^1pDU[P*hUۋ|RtǗ"ỸN+ie,Bϛq~zIHB͔$*#237Jggr[d%Pkʇ&Z6;[pCg C&$Iʝwjs_=T##p`l8-c{yS1-+TsZ%a,f91Ů-.Dpݹyp9a:]<Ǝ [mAU N-KϺM1j##Ś> 1k cm.0FgO|1[+2c~5epo>FT.>-]B[uj/|)]~O@]qZG4Lk9q`;K 2 e<cH%\)!+1:oԄ>$0o1=GL!s$+~':[!psX= XaKt~k(2HLcmT RZIwsuFi黕pV-4o& hqBE"R)w怺0wLL+%8ʍc5V]E =saZsCˆTg紻 t| 4K*oe0 7oxRFaF! 
||L3 !Q ywԸu>w@X145pv·er}^Zc`}픳Ar8w5!ۓr*}8bN6Ջ֢:9~H4dp; v(GgL陜pKUS=)4 H9Ѩ楗^ϔnI~o6D wO0[;q\2OYjls&#da" $:4 ޾UԂֶFvF/.e:-܎W~}aqc+zS҃ 9FwԂ3FEK dkybۢI+Zqf`@C5"`14v|NΘO{oj?T|cuxƛs[SOnHߗ榍JeZ3qu6O݃ĤiY:Pa%K4Ƞ}-86)u\ s4c$7 ᾲc5(5{ǚQ5ra+p,3$`8y})qD[fǯՠrו0*ΓzuH &p$Dgs[Lp=9+IvB}. Hlm S7?&_[,ó5B+r\6[mlMϩ1"w @ E; g?>vvu6w$1pB("K`-^޵RiGnN6Z3w|tniBDAq͉!/!Ē%URq{K;=_r+Oc-, b8 t K':aP {JM|9d9k>kdw#Q PAD<%Px.K*HNJ>rF~B藹U񍁪0R5GmØc9qE42@!&l1 Ƞ㉩Ӊh}ʈ{-I^xogw=ѯ)ۛ@gU7X H+ }0k逖Wsl6+mKofdF{*v9/em}EX_y(NJVU@QpN^5))pOm= 1.a4'aT^WʥXV(o?~ܴQr6Igw'L~At}KGиM rS(k$ B#fLO`$V[V׺Ws_R@' BM/هZB?ċe6X|TJ`4$Ⅶ=3 0H?~N?{7ݫb}c?@wؿ ? _C'!Doz5:5%zGqƹf5r^ڈx8Ͻo Z3%? Fmxr<:eCn[SV#SS=M\uJ o p2/_N%(cdY|5r'΄G&q3_0Ss; y@x1 ==vW3qw9d2>41mbxRhLI{K T XиM{#CQܨ?Lhᄹ Ŗ0LDqI4A% S(0=ϧ +Uv4fǾ?PΧ<;QFq&OD}'dO_왝]=kr߹3D䲓`j$Fo _ 8m1+CeSگVEfؓ5x4#7x*m /=OYy6š_ֲͨbP]k%L59e!p唭y5Mg8' ek,S K ϴcTlU (s!+.Sa^(bNj$YRÝ"AT9bG()Id8}EX3E%ïݒۛ˺ٝ>@yxDDp߆xo~Bx9Of\||d:]e7gE?3`GWٖ5a&oo<\ c/DJis`ZySJTZj_6%q^7%q^߬/ !HXe1pPk3G^pUgSMzCMu)nzis*ٶ2U[*nF̖u,T" |J9a.1#WPZ`B*KSIjÙq+#[d d ~ ^@x[ 7(SdQ:YwJ JLDH)"RNH7GhD@|u;pB(@ύ tA  *AX^r|JHfRKc1ZΑP1'`C;痂ݧE3n 8o/ / 7{/c~ƨkzƨkz$.YmĦ2@#Aq!txJR&" y? X#=œC“UTiwoJcRJR9U%S59a”"$s !>BNcB,977EaS#.3e]OBqM =>5 W's.%4BR8VY[4s:&3?\~^~nO 8eS0|8gr72nÛn`tLٞl./ÊeXs$, ib I xk| "Py57ިX' mEب>|4dGFI",> ;dǗҳEX%tjI/T7s/)P;ԺEus$qڮN$lo=mrOX؝19OrC`Ngs!D+8:_hϱ;2nZ#ma#rq"upx}wva² frwݽ|6=2.*t=3Fm WVȇs}Ue@VHO}y%*grLXTJA5m^c!p[ jRuT ->X3wWtt ]\T0 d58G^բ*EaڤVךNJdV u^߉N$)["QSDaUNKQ/Fl!X\Zwi1SCȠi&/ynVs.Tb겕@s~L|W8]:(#'!K |! 
v*DqiI݅F(H:N lyue{7{~Xlbz;$=NؓA J Ñ=V.U:#{ 1L_\~"+&Աľ/o_h䵫m"tre&ֺڌkP{:<.ʴi\7#0E,_FXpơ;J=B`s;F9C_9S91^[1ھ Twv{ ;}\ѥKmf~W?>fЙb ttܐtJܑN:<*AR0yPq1s޴ b|i= L/v_zzgP4iMt &qO<ӭipSG\w-Ab.YA t-Kj B=&,}3J) LsW9[8]kvPp`@c8Pcw"N@[よL 9^)O<ʒ)B{zN fEtC"![tIp!Ikfy:0mp*OlDrkX~cxOt{f)͸wWEy^W~ퟟMKc_ҘI?%J̔!EIcYVf*ZPE2)YQʪBg%Y,}٩em:en fv7xJ]ۯYs꛳GWl֗ewp@j&Bjd.KoѾj}Ç@LJ^ ajiV֥a+! GF0$IVCiMAvVƭvi5nmFk޸ލ[W4IIj*0 E̵q}Zx)#ڸum`zg B3z5n}2zw( mS rGAMKݢ!/&05r8F ܬDcu7b[j4u?q+ul,]=zAkZa=;E|X?^S!{ttXwq (Li:E{PSV1E")z>n>0R"T|q3~'n9)}}V/Kb?*kݗ=_{ 9O &N_Fjh ;]Ob%K.:Ns!@~:|5J""w7ѱ<$Ldtclæ )uǺYΖ70T=^Տ_"%^9(rLʡ0VTrHr!(Dh78vﭝ2vOA5eo TOFjP-ٍYR7 )N5*[3踅$WLkZh.K&(ԪyR6~Y o!_WD.W${-*=>{6&)oO?}tYe#쏢;$Ŷv.o_Zt gu WF Of9Ο\Wp׌uufjW>5Ycǿ|]~]6 %nlvxXFV*؁++x.]v$0pB}UODr~߸"wVo )G~ٷY}{s=ɛu$؊neUCE]r~ 89߮e3Avg{-.juLv{h_:.>RM3kc4C|GDkΕ7 `Q]kn\`8"l,_ku{Y1ooOt>S1KPFlNys]. r5JT*Zra'sB{fHh}NX 28#T%8~CN:ƚazssP\P3ѻ='H0#8P8N`['d @]瞀vkf8y*N; DFzhŏ46M"g+3\~H77C\٥RA*SauEH)ҼxK*JC.J!2ųL(d9弨՛VJS(fPÕ0e.1%2*ϴ0 x] Af%dEYg5JVa7Y^lUZ4X$Itb9$ŶMXNj2ۧӜݴso'?رo3pϱݤ…`@ DyL@΢uaws=^*W:aK)^KsP$FtINe&"Te֙s-kD+ZI| bp^Z5u0ֵ3IӁ2&DRxYZ(eL<,CSkFF/ww4 zz qdTKdbV'{a8VDԩbAl{Z(G33TmCE"s=!/ 4hmg~] U/T+5[emCVMfG+Q c, 7Ʈ~\/&OW}D5r՞bNvOnFO^eFW0q~ކc& ,UgyYS\r9de>7ȴ *+D> /ն.:\3^6,A'6o1Jdpvm)g]gܠs)d;15N`$%xގgDǺǨ6$BNIP <qPa79`T*& 8\A$)5nMqLxbĎ.ư!n $\r De@A3L`x }Lr 7ɽ9bNT!<+YVV׼`L 2*r,r2TTPUkX-,om-HkP'1 JȼuQLgf0RB 5v ~>괂n \|Yr5^ &}Cf<==+[A__~x~wb!JejZ)w7|[} 3>o#f0?>\]].]C_Ϯ5>@+ROG Ažyo!В͓ajƵud-+aKv_;>͌y)lkht 9,;h fFDVzHmIm-JpR"RVV*JWRSX9Y)Z)6V-1JWRӎVvpo[*iJYZj&+=e+ӎD@G9JQRH(;\f'M!z^!࢜DNSJGR#%q`LR$b*)':AџINU<3uxc\OD9S: LsANڄf1]rBE1 0ց9nC!v)r<,![]\0m+g,WoNCAucL\CLCj-ێ !cI{!Xtza0 H;6 SJhBPh]zw#0@:D] fS\]kYrxoF@<ڞkeGz!Ezg':4X7ov-Ɛ0%o=`iţNWvb8*(/nkSxCB :y^=( Fu?3NÕ uҵ23G, sfN)Q֌cQmetWa;o-~Zx9/ǃ>0{}~̰DHGtΘkÛ1v Z~oDޕ2~qo7@< gC[aFFoDLa} 6Y` 58R^Dl:lS" KHjf`y81O!Ur $lX{m(0yp!?Ll".XHrn2E;# BLvАd!9 4#3vVrJ,s.=*Se^ZQh D{$w:$:˫\ӭwL[U%2)L* CX DQj0Fi2DY)2ۺM\4Ղ |XmOX8U/: pҕ.T:rȦVRxPAvmlN(9lP0T ,͞&T-RF0eQYXM)VX%`2TRY'fWc28Ӛ> I dgo!ƀV. Bq@[kؙjgSnV&B 0LId/FK,TʾSDIl.;9 \z}o N$I1M:њQx. 
p)jCݱЌ1O d<&W1\&EGBs)Ijwٝ"R.)J'd'U G q&_?=v|vzؕl- dda ѡv?ݴ'3-6PO`tSbycXw?t9A^ C%G]wS&~̰,> < V7d?4տBJ'|xo]_W'qHZL%8#HP"#fh3ĕCP-u9lܤd;˛I/l˼/H ̼lK+Ed#gKduH>ZU.v_Q(Be:EWGRev/K1>mwTy 夋8(b7C)&7fIvokM%8%?U„PVXv-Lưq%;W}?[iNR_dz";;]A|Ody?~w;P%]ۥHJ A FJ\3H`SVR[ cjWBρTJ(AU#@UL_I?1ݵDq%-%ZۢR eZ$U(WKf0r~tߏQNj.YhJ]O"bDQB-їwM1*^4]3u#D-}_V 擻Ҹ|U5xV_-{FgX 2B+jhsqx|%bo~h'OSpLP(dUʂc-FZ[wYƭq@U9!(wOG?QnӸ&.;/w>۰$&=q6X ޏu:*0@n+yBI^y8 xhH+`Bю1 e4sM5%JLT@BO& VLj&+pSBWF:5sI`o`UZ(GwH.{n+]iLZiw K0”qR!ISjATY%`iC@DFidzig{OSw1͞VZzYlv}?_^ 4wn /o`ǰV rŏOx}\a^ PfQcn?7m;l(ܳCd *"DqL+?b "aa=9-37e wo.|!@1@} $Ѩ )ôG7+ъPYڲRbd@c%C%)j3[; oGbw,srbX"%{w`D9 İl~XJI8t~bmNJa$S`Ln'G^4 ȨuS]S*ؘhwBwؽa ~=e(S_g=LO|$Mr1/ߕWn]y7گ3ZQ`lAU2+<4rw/:2OԅWK#(O@d]ײrqmuyO!?f|R,q(~6u_Ƴ4oѳ_JWWٓBj%o"r1)  ([A@B˫\v^HSġOg-z2BXwOY5H%7 F{k*yb٪$n^*x_Gժdn%ilQ)C>Y-F+UsB[kѫiK۟ \ *,ƚbǍz7hfHaU }8% i`cDUi^ȑ!4P/;^ N5ҕːZ5?E5AX潒=t-;O?N,,m g6VRA֮l3w=M.^./ebyg˫r5 EUWgo(;:Gfֳ*~9ޒӆF䩅f7l ׇY4䧢W+,Jˉ(9^mJޒ}^CD9(@$%0n̍͜NmI \)oC\ztbhL8n\ JU0np! C9#(C/vcvTP1fѲ Vq#@+ؐJ[\"he6di4^o3ݡl0ȂV."-/bYP2^)J$n"K&5t"Y+ƈGWMU2*ȸ[}tC]˿mI#Ť^Lٌl2,~Lqlm:%:C+b*$).y+Ü&# `̬6TrJ àJ*'(JEf ;f b-wsn> rP4~җw+X(vg0Z|TA|3@g凹86wCr+;XQ"Zl sM$P}voIJJa^]7UBR7 #y :մma;~;?gÜ BOE-]1.^.4qkWؿ7Z9y|O7~{gv>\_]a"FMWPĆB's3X۶݋fa߬*z(//sK zj)ثv;!{Z?SX`KY_h&*RuIR گy祰?_\.샚&p݇o8W-3zW|5wX_#cG  uKӀ>,CPǬsʽ%.9x;\\|_N* H K+\JAQ" 휱\Z]l{6xaSvOas# yq-;EѶqo6E B}{3Ij4ęhcڟi2繬%nW=zm8:p -To02#/ٚDbǚmJa? ʶU'Ts0- I8߈\J"ZvS!`m'Ȱ;Q< YyniiqQJ_xb!9*XrD+S%F[lA+a T i'*4R\:˜&aUQ[ %@ٚQѱo)n#&{ygCmsj"Bu^2֮Z[.E]*vP$€hmҶ~Uﴭ?-WAWTY!i-F+EeU}NE)&SeRS$0opXjBU L0* 8_8п^C|p_v9qLYݫ'꽡(;1gK#&r_}c"XbdJPBq'QB0B먞K +" _WG%,Az,eѸ"!5:2TO(nɰ.5P"[+z< $%?Ͼ dAۭˑ!&B4$u6KM m<}*:lmEe:$V,v+?I-HvəVƮXٵ):楻=4 ;EZ58L5\ uiKjhTzS} sQap۪No. pLUoX73gtg{2cWl Ba?Ẍ́xM7nV;`SNNY-ڿMisD/Rὤw3g1T.N4B>SB ~iA`r1:Hn'\Pj>ںnm>D;iM5Nкb3trbNx{NͺWnm>D7D)A%+dӶm8BZn^Y PŅN m-crͽv j|(O" 44,9'Ҙr[i\p,HA ҼN `p=RRh#`Sj)@ $S',9O~2Dre<2e~ ^[St'  6(tzuɢ8ͺׁb&R j1AERVDY i2t5+kET%RFn3 x?. 
pQM$3II;3AZ&36P\QxnDyΩD}6Nu&A 6$1N(F!ybA{DnNXQІ K r9e`{q%VRg`V#CѥzxYg a'!1QY!*cJׄH d%mǾZz ZQ9SJp+l8潞 Zoh͎ӊRbfһWi/% r}I@I0/ͯ!H9|#(e,4~/k6n:d\ }޵4(3 =4伻P&m&EL-LYh s]L2`9稧jj0̀y#d8a ud7d/HZP4骟31y!UO{go1ZhE_;o ;k\y{|w^7O(O#LĈ"$ZèH`x|ZxŭR$?Ivevib:X^959XkOeD*{Cjmg76J scȤ I' L6bfCdT2ۀUݔˮMZbY!MIgSCZNL)8P*9&!XLᒟ&]FmSRl:r :BF^}! q1 52aAJ-`l  iƊ P} y!J)}E 狭ҪXLLŇQ`f}[6q03OwE~T ֧ޯƦ6>WGwwQ<)S4K& MU?ODF`/:|Dl"n!K7< 3(90}letyÌI-`ZUR팗 |\FYfbcP)y<ܧDQݛ_[?toCJ~@-77Џ߈C(DJq|!p( !<8BHσF!c07-(.]ʶc2VuW+!KZ!C!  HN{!QA(A@~%?{,W0KIgeb HHq ^ ߧ!=5%ԯc>;1_<T`0> zs'u)߯@@DJ#f!$da|(igZKBi "_a"qjUq 0E`6ᎂy͎S=I1cqy_# m\>/wQNƊ%5qt?]\>VZ~9x<'R*$K&!RjԒ$9k|(#k5ȅqyjq C2۸|qyly-AazעmZ~ZNԳݕ vd/s" Ni<=HDj5B] ,OsbF,VU$_> w:}yh7MW! S׎yv N7?WZ4'(zc)D?E'N˹tty 'Zc AC*3jGO 9@pmMq&L0rsB3̓|l!SyӻQv$x䝟&*HH(%;U݇w~λӯw> Q=3I .'E!*C2Y!Fר˨T↭c/T"eQܙB_6hk2.q"@N=eėbP SXmĜX0Cy*\s\>w7/ Xvp,/egM0u+pEh<}YG]J8}L6AG1ܮбcTy. S5{Q޺[Վ^)TƯQ#:i(6DL#H1_:tYz;Ʒ͡u@SM$0aº1n;06f9/Ɩ S͗(N0׻<0qEߑLyx8VW$U50F4&=f%xo3?oS^6Ä9jzߋC"=(QST&A$EL 7u~MnE1ȣ곾<'*ns/-[r"ZHkJa V)<ڭSTʴ[2ڭ rݑ)n)SO(=ĵ>3n<#=Ig<|8^QԨ7z۽y7Á|9[9oޙNI\ٰI~OP) 33C5T3пq%;փmiJLf"0QJ9C8zP @2=o7_'[M-Ybámd]L :!Z";÷ple2칸IxMMN7Ugr36LO]#K7ʵ/x}G5fԛ}5?(2Q{TGQ>J/[#{-F"pTXF~ҭAN^{XNa)9WIr/ zo"VHp !JN" " CHP3)!ʄ'"$@C\PFE@Y"J. [xZ}I {o ax`6;9>\\*VGlͧkER&_K7>y6sǞC= ̃fm{EU?^XQrdD#3 #'R;O%s BeYoI|:cMRʿ\<aQ)->`;OE[~qo@r|9vݕUMfOSu.@o&ٱ1! ǹ Mjl"4G"CA0"~&vpo=Z.QCFBN3_)HT_a/zֵCU~#k4;=f1z+w4uPAV:l8])Ȝ&ӗ2aN=ѸrZ˴N-2ANd4*vgmSz!.3igËq,A$ñ=K*WxbuǮ΋=})$ =%*rt5JG1]%KY6":ۚT4Kڂ5AnU&9EIJWpe)s@@ /ŜkB!$4B H{! }P(FE/)@bwom͡³+49JV oMs"kGF};jASkgwmQBؔVhۻLe!TӞ!j1遭%Gu3,Fd.^BZsqNo*0!vHSBG==x o1O:1'UbJ2#Ӡզ_l΂@cɱǮ%T`T[)C4Tԇ>*3~d1V bAdc]o3Mw{kTyO>`Z1xOWia/u7F>1I(M{ +d  PVUFj[XEZX?{{afȄP>WY|%"46_j:^?`:꯼'҃NuP )dГ{H( H=x~%*GǗ +d"Auk`C@[̓hx6671&H#=C>|]M&(o1 Y!޴7d"%s%*7`Psh$1/ -e^ vTĵta8E맟JBTJKޠ'+[gsf~J>5yǹ'V0{O<9; No+h* X.F% |s: :9O? o} nkcyǙCɦڡ6"mJft0<}y_=WJC| Y@Oq+`JǞ'5t[ HPMi.lg֓D)a>q. 
bXcm* fR@g#Uys& Yi$_3QvQi"bK:KK\C,Sqr[3n=1nlb:[u㭜/gj E&zb.C̯9Buoqs$qJ_-;]lK$Uj%pv뻃4n]h ֟{ t=vIi=;hS e ܕD>-tڢ-{ 7 -Y$ ^K9>M;\K>gy[he|.AEP1 ^A +uGYQ\q@%81;1ƅv.z[öT1.0/.f\|%NkD(ffشI'BbݶXTens5[ rMMr![RKFKeKr::.v6ie,`]bƶtۭsWhI !Hb7l⯐@6+ʽȧޖ|M oE}wh4ʒ8ݙC~-׾zD/+Mڡ]EsZmT%VM628v m=GD{r\0Zfw@@&cڄQ:MroHk\^&c/n0wfu_+}EPO8hb>t11̼QX5?jI6yx܉:EYl+C5O#e6O:ox1 mW۩fLq U-1ע]ioG+&n ,#N|!578[MR҈ i1HbKT?utUwuzvbځv3/}c(6/c &޻T~]ڇ{ڌ|tsջq͞Ay4sC}-h94癹~Q`.LϽZg]pIj ƿيZg<7h~p>@GJ1bƈLn/KeHp#dk"ײ \s,T_=K!ȈLnkQlcXCˉTAeoηnīɗ, Go2*rq 4s\ @sl6CB2C 9Zِ[쬳c)F<%>2YP)i1cƹ("̣5N ss 5ܣ{%UG&X2#3N)>QmjU<ѐ"tK"`ڰotZt\/J#jZNcL!뾫, )Jv-}ſ.Pl$. 1Ο-9 ͧ?,6SkJz_5?IߒGy^1_}|1sq<9d'S%,gɧ8+AlsJrE]N=74$-#W! 9h5 oޏFhrB/zC܍{Qܡzq6\#27u饟0~%hA l D4Jj,d!G <-T( N 51Fװ(b6 ~(1}?#lO5~^2W\a3@*'3$@Wcr|D%>g5g9W0b^fe E>Wc x8T a^(|Xbȣ6 @[T m6 OxzztO izh<%:\'}=BN-~Ҏ'es3ꧻk|>$Fh1ӷ{W'cFZ(\<3҃>8\_,:UւUB ]#5bFp]oTlC%@*Eֈxlޏ,'H5Z*LEXUVGxW %LIVCP` ړ''ma r Dr&!Lrm#PAΘ0JH\WՀ*B \R#QY]5h%ؼ\θ4p!Yb23|4u |%U3E-9 _6 \`@k 5 s/!]f~ oXbm%}\o FfE٫?F?Ւq.|2C z?\_/Azf}? o/aV,,w*T Fg 1 -)uv y%/\)K\gWjR' |f݇_>F3;2M%Zjb(\zlByS})r cO BsS틭P@g vF~n~Ni1{rbW>?}U`$P6ZȪb@wػ_7.^@QD`X9Mf!@:(- \ bITkoB˜8R.9Ž%C F|T8m+Õ58,h\i%[lMO+1B]u=Z\PZ'] ;}NnwpKI⏦|E9+b piqw$z ƙ)Iʐ6 -02^D0HSER,)Nj XNaNQRrQh-xCsWpZMJq3]к\OST(* *3XVe-%FV )7h:M*Fä]PKir! Cyٔ 4*B)7,@v,␣zp,i'RT}j 䋬rJ*ٌOg֪)aݜ|r"![mV% IY+\. 
JNxQJRI'tόbQTZ)\1`KJf:)Tk7!A4L]5QRhE5F:hY;Gϒ SIhx~w7~ԚA\~T׼ B DCVvz٧Pr8|"Qr F 59wT+x9/簒!0Kw5_c+]%er{iS뉈ɇ1d[8-pl]7,u!R0ơ;%Nq0s ~%lw:<|A)y.En/h^?&$ᴣdףVhkPN\A|8ڧr38yKxB= K9uݩ KRe|@)'Kח9[YDmwjJ\ 61Zs^{)tUPf4 RC >*/HL;_7$5X#RSAџQ)춉Qm4뉑rgM @rwemH04%>=]w1O@(kTݞ(RtFNRvXXB%L$2":3X,Y!i6K!93kJGVzyEĜ oبME #BvqM9f[IA rk كaZ~C A4iL*Srm .!in[Zد.cE0MxlPQ|HTxi$}})/?ԍb1(qn76.1֙(im讽|unZTHjMz=EC, ɺp)F5`҆$ UG]Q4hj^ȺT\&]m6Wސڿ\zt\^ 5CyꮱQ⥭U]ΰ'\Lfwf!n8)d-+?a/"^ k vjxP zUn'#oƒx>͔˗RH kpETI+P]7B(Cp(Fj6-!NyX]٢L1$[ZO0gZFzVR\(Yhf<:`2M/Gm*“:p&Ck3d>(cT&J" GI*5yȷuW`GQj^/Ypq$!Oq(^&ҿq^n:$q67Ą j?}IQb zt0RxL0]_p7͓3 Ob]{- px|8Gܮ ]2PC};s[sH'B4B_!Q<9BNEf,7y3VCvҜ(ߋz%^NjoO# $j9L*j:pٽDg.՚'N9*".|ҵ@TB1ѷ/"['gnQzcXqMJ)‰P"MX5 w2:.&Dx(;yքmehL%]hoGYPZ桾l/iog9#hJ/7GoZuxJ+qc,&S$&wd;Kρ5ž$='s_7vě .G{1$fּV H]򁩞)qMm(] V>Mm;f4iV)/Iţ܇@'XֵnZ5}kjsI9, y~1]\U!Ne.w`- 5;V/ I1\ڱ~1ǃts_V@3.Y@s>H;]ҙv"i[M}II]ݽD"7twݍ;m ˍ2|`Tt|+[X䰣|P)>\2Ncy ~qqݑEŸ4st v-UE~z3RCc* iV^ZTIlwݚPNVlw9ګT^uL+Ns Ne#ZB:0H5BXrךfg*+%ě?ux+HN͚ odlGvkBBjooN-|L4lx8;8^(@1#0LsNL`8XabƓ44FJ(5#T)u5bC+xw %!$}RV NJq-B ;j["T3Z'( 9 UL>[k1c2\ 20\3+kNqo 6MaXԧL8o[vf +ъq8oRIk0"r#%-@t2)_VB}-/HغX-~z b6))ˉG DՁDC)fijA|5{xE+_CO4x?,~>MLȽ{\zm+`ߙ+22Ҍp8RTbK SjV;c-eD3RKTBBU:ОQ )w(n* TA@I%ѭ"'B?ٹtw"|\*!Xڝ!Dj!C|t*! GѺIYW{ +,OHp"8|Hc*J ]JRP0t$cg ߾]##auLҮd`-Q^7ӿhrI`dի~b:?峹 wUڀUakw,v4wR)^y,(v|XH c}!EX35K"kra!YB ٺ, F#F1QCcq xH?dMK(Qi-TH7eCl1lO%9Ѓܭ>B%i NR{ 捁Fzr1%R6hغQ#łt{h ր՛ɗ\:>qGgqd@8M=ix;/>0? 
?<:l<-Wx0|NCܷbr~{58egxe{%8uzHZWSWZ!{VeO`o/v<:vp $jf!/E |/#w?:/wq+TbHjQ.ӽ |u=5t1tpͭG;P[4B5㢧/ujZׂ^uԍĤ&QVFFO]F!f=57D*Rh]JL+$]V) AP9gfS:9uVA%%T5gT};$T90 ֟_Q]6 cJ-AъYt?[#QB㪫 Zi&M/G..F+<Qk4d-ӵ 2V8 wu{H 6 ~Ń+0 o~1"OϮ eF'ӯEPvK9h2dLKUTEv|j=%K"І(?u*,GiL㹟ͨ"*۩S td>q(1ID`-PdL?e̸eKŸbe+R#Nx|4K^ {?_HgkS;=2t0/Y{9a}mxT 4+ĖѦ/F\)ܷLg*6Ǫ!/R}kv 75-k^-%-5wԏLө"+S=k{azY-)֪~烥D w;׿$Β"Rw؋ 2Nz": QLY3L )MqHSxXcANk N!?3#!)QF㊤Ҧ6@7jeU&8Tԫ4xS4,z:g-JA)DcT7+H$Z6ѱ;ؕ?URTPAP UfPa _ֵQWy{L!"ʼnjçz35 GR!B̜·$SGV:$?_c1Y{)viʲ$| H+B9ޫ|31/}~lĄhʂ"7#o`2}˿[$nKg3}5~­E4y-jW'e3` 9Pֻׁ)8!TM#HkX-!%g*X:Jؠ,Z1 Yvx4|5fMDE(KTbROeX3.5\9DւƬ#JSDUJHg$QgG:.j G\ -^zqU|)H!‰X/Lt~p3a{¿q<(҂rA*bM~vvT}ǂ0ap7~x6yZe]]c\/ʎx qܔs\Wy O& y"ZI$GRڍ! V(ѩRڭ=nK"=VavAB^V)mjb/v֕dv~jkknFҸ_\Y;35:yI+TV6H<"Esة- h4ݜ{=J2%fiI7^JW㬏V|uFOI]L1Rt.,>u|>⻋t,YAn6>a[?ӣwSѻ)}zӚ9enbb +u`'fQIsjR#VRd[hwۡŮ[)+\mU-%5m˚n[j2*kZZQybukV?O`Vk>-МpLfrdhK;_1]wL@ iN^Vaͪh(f/E mv{TVJ'2ݟɱ³s 3I: 2J`2[5X[I'9Dª\:M&PDľ jtB 5F#ɑcQAЗ1N &M6 6K!93kbeA=cnxE~ˠZR悙775&5{)uGׁQ Y}^9>])]Z f > qrAː2^R]3%CTvOhIN`h K3Ly`4o<%a.:vF*:>R V0ō jYPn1)ExL]@D,˃^ ?Y ZwT}F"04@YpP~I1@NBaBp޻9V~P^,A[)2n&p:WB%^K'PX;tQ ,C*"HuwV;SNY)囥974-żkݟW¦;>3ƨ!/9zچ<6)F}aI05;hN-;fh/VR 9&G4.Ҧd_KuecFcrg&|;s^ۑ69š~67{hq`%g'?݌9v7I݌N\"%YnZέp yj28sLNerfjjֽ3b+.F掭eWB_r%,'@A),?S<s:EDA{u'w\1m{ΧM{*h{S22*V3qlgFjD a;O>;\XN9?mԇ[Y*[j)9K>jl"@gUO#-`VEQQBW }_&5?fĪw M:]] ܧa}A.&ȩep;5;Pnܶ-U#n{Q.b䊢@[~q-uRQ[ ɷI֏ϓO=>Oiz%ۤU`pw~eeQyx4s2zd6pYqg[۴t~ \9;[z>vL 5C4pz t{~s|u3vCnIE}^Ι<<&CAۓO;~v =4?N JaPXq3.CWQlN2!7񯘥5lffש?cX5DzTHɌ Q+=]QV:oJXȈ([x:Ez@㍞&J̳tֳ6B {i~6tg&q)RT+^+2#1CAG! Y9@oV$%=Ci |6[?׃JU7}R/vkCˁ؄ZPiH)_.v9,l6R OkE,Ԓs3B5 _T|VFhsEV΁r}<#Ls)]SN3c޻-w6Ɖ[a6☀g7gd_auͼRo;]XC?ǭòkn5pe罳q瘴h`iS9˦CuzDE)rGj,[j29*0qeH2G{7MJ{긡RX6ОGъCGӳ̪h;*h蛅Nay|v~s9i:焆fSpU{?TR3#gkc`dkG&){<4-ÏW[zbp:3"&!0ZSN22;D;([\+Igg(HB/. 
-0+2((B"0\3e9y l$IB/=3>x^[X6lxYe#QZ,Hb[R1+3 Cb#M`.M;%.Ur Tke۟,T"w?_!ȁYTc1kw9∔|Gc~-GҸ* %abX\TsLu);~|Y\r,}80zbDž_OVʁX74yjYV b.>z0({"25BSJL"Z!| Qcy'""'*a S$9yz,>\R4'} c`*/Ș&.A:Ů8Z9ꛋ$v-l T̾"Ǚ[[EIGPU3=δ Q̎)M<ʡ?Q5ӑ_%F )ΨiϠ} smTm4.EgQPt gisx}h|/QCl,H/;+dq5/?\x$fK[Ni:bs==w,>92[XC˥c.xbF; ")0rNNW韹}n RK͛R5kX!Aa6(Ga&92QrLG ÚrhF7*=Mqk BpB|(A:x/ՠR@24!<;,DropϳBfYaI^Ӊ$ 2ۙ_#"Mfeg{'?:q ri,ڭ{4Y%]bc+]|{}z|q)-%[_QGd4B߮yTx nEǛFl5D1gTSgԏzX(8VcWQL^lI[akCA Y/{ʛcjrR9Q#3YaCBf-')C!FbRTqdUUj >9ʹb*J:K s #.s{f[2n$^hYRQq}0}:ehԈDG dx{nT#=)L2A7)l3!I#FB+~ cFSAd n_j2Y`T !xt0JKLJ%-` A11ɪ^&0r9!ٲs5-5'Mk=lRSKطa>*C\\3},|v{AP>yv~xplJu{(H!Xظ =4 1'hȄ0CeU>VsXD1"iZy)/[zsB=n*Tp_v-6/wRĉjPMP{t GA3!RȑRilXITĂzSEt$fJS 30XZ#Dux}& QCȟ,~)a]j6&vk154[}L~_ΧlV%~0Xu܆}߹A!H[Y ٕfgC=X;fےSP 4yZ`ğHet`%q%.k/2Ž%&DwZ{jiiiiXV})I23,~KRN`8ɭBgKEVe\΢Nqܪ*5 KFJqZbl:],υϹCv\ XHai<ɝ R4SFb7eMP  &YH1*e&IlB ;LƱaE:2׼ޡ,<*Q?a`Өy}6[mlI `9S9S-pi5ZqLb JW9zhW3bK+oW')_#ì~%hfLq>/7Nظ$6$HҊm!'h`k&ה{_(O%$cګ}T_nP-:٫^+]BJ;UQ}NB$}RrTx.Hi\\xRu^[LoRibM脔rjS}MJD(ݧq|(A@X솸w8a]&gדK\SwQԯ+~T0 FT_P i?cBdb`gί"@XV5넞rj4=i4)EzǔRҤZ$}R"21O22'c^<<R :-XI).^YcJ)iRZPd')R&RJi'4)-Vli4)e+ )e,MJ $RyCA.H)OKi'Ro޳"퀩&%^ υՓܷ>A+cp?6ZIDS3ΩDZT&R!Ή18wX~ Vxn1U/(ݵ̻+xBu>}fr5DʛI?UKˡhDbQ7EMnR[|\1A30fhC9mqR_n;P1G7g!&h;z(ܦ8YLf7C0D4Udq*R]J S!˨ iMNV T\%Թ7\:AD],' b5h~Ԍ׹"X)2s#"H![ʉBt!uމ,"66T"j9bQ#jLBhiHi'HYa{3 QeYh˦)B^OX)mT.$H[갧Rsbrux{Tthse(d(/)Ƣp-ːXK-fci 7R9ʲZeB3P)l} dePe(e&HGgX́rB) ]{#NIj(~L_<'9d T4T#m; 8ͯeA"%)J"GvHY $2Rw4/^{Z` 5kushɓP|b c-F-'8#ǃn4٨fTA7BWd#ۍMqG( `z_D1,D 'S{LKx%[,h\5rr3FLXg;Gm#,Zmpuٝ$'vU?Fϣ1ZspZrswrr_`OVVY~R=$^%swV\Y[Wu2'Xg7YhCG?wp,8 j&t=s93 =;۷;<m7V;,2 ێQt>э~[};2P/~?= Tt32r5 L35dP92IӨ38'y`QkOFmȣxyakiPAd>>:{\w- xq 4#ї >V#],.Z,fY  Z0q$5Gܺg/͖,_l(˖(W;e=iW5:Eѯ<V˃թu;2h֭к!_='֍ OhjyP:}tn'" Tͺ՗`Z>4+WѽuK*VzW;[k~ْv;+ sCi x'[f屾k}?,F`֧GTO6*\s|$=kJfczU˭,X|pbrysl"iN-j*qJC(vrlvV ESOx,GRaʣPχѾXhb.'`nղ`l5\39^[^^h'̷%'OeQ!Kb &g:1ǒnJό[ ,YC yP/S(/ŭH49B!2v$od$n0ܰھbK[,'CF*\PZ<cT}[;H9a yY~~n(j9)L/ѩ-8.Vbay{@!E'O` G.:L%Df|g+ (g~*"}a!_$ Xi)E *uu0%_ Pk B%e494k`ziânAq诃cG`Pm}žS?y`pdbt,|~f0i0)X8%ZY# kcs:94+MְH& 
Bq<lz%o2쬐l( aa2f' sqi `aR@56i҉{ gpkȨnUZk_wk!Hg7hϩT\Iv׺HMa XeZ0J }b3Q#0AbPp-FwzP.iԖK 3Tn屍 Fٲ;X1_^{x79-n~N`ChdaLw8-oJXjUC rAPu'c[Xhdi E둽 7 y5~ͷ\}'j{vgJ@2!-YvNb\tډX~% (# ؐr} `<,gs ԂbrtX{;j候,TǕ?FNv񀋺g9\B+*zHr0Zn] -zR=^I2ͿJI agta.G]ú2DS;LQ̓2u$c4R|n,-,/7lQ}Şw:Pc# [b7Z=j4m-F]8azDbjN[8tjQqϹοOF-ZAWe@NfSL[K2W6!_' |"u֭ESG'v2 dJ[ueuACrҩZzOuAź20TBIuCCrҩrsƆ\y seY@JZxdr=2wXK)pFO.9D= gU9,N̗ԐJ쬣T,=CRPX#:~DO) ;SΓ>HEcsD }pd iydR:>/`LwjwOpuv5\.'@/ը:@&fϑM NyE( 9sºNZ]r>VfI*;1t`o򶼈HB I*nLx:[u oӻ,L@~rDD+۬&6 $b_dQx!#Th5ZJLk"ͯ??dA3|4 03 œ?eL ^-Y-Y-Y-m\J4erG)3ܫec KB $h&:ɢ#S# Bogk;78[[Ycn''Ys@7 OKT M- kO,cCC%3D {*WFcpY0$nOdoB{DUD54hL̖5ocS!SDõ$zK9q @J)2&-:#j(T9$R [w!+ѡTrn(ߍd ;+=UQ`)rCX) eē D2)EVj#xŕPֺMCтoEб+̟V7`b`L6ҁ. up;0 X%Z~$?mJIxBm*9W=8]AУ(4أ>]" c g⳩ ʥ)#0zB<Ɉ4Iri8Z5yrk+PXa¡7t sBExs)b. W %&H{4@}oc){@ ݗ3q@II}5~AT뺉#Qu4!rbd^Ͱ{!v(0t O`Ф@2<2>p5gd:aVF!nH&9,P%LC9jй"Ro gDl9I9D[;'"q籚 ;91}M&C;AWO'"w4ccVc8<*48`z(?5t5w` ;W~n籈=8nhP Q2ǀ(1oz5v5 6C S!xxňR/'$A=CS-=sAqw@X0BOLG*Xs8fR5U πfkf7YNs(Ô",1[mWM&y֋ W\FX,*("b2R)O0ԳI>#ɪVڛa`j.뱥m( T:Xescr:)^)>B`&ũ٢Ia.b옏Zh#$r:id9Oƍ] ٜŵ]7R 1<1LYUr 1!J@ B5W2U_N퍏農)~0&gWO 1T$8f(MF&u] $jFΕ$2&Hl iI$y q#:I%S}"C.=G$WlHr$!>G: #aIY(ţ{`셸j88Q=o5\S* o!#׳o?!ca&xmPM"Lb.ju\'~둊4, `?!JN<7CĽ˲F4UcIJ5J.DŸC\X ﵽLƚaEG\4&nF7zX5WTMMH!FERW\PC^U{k.nj%Ǚ(CHd9m-S"b6XoqD-D|82ox\\Iޞ qt)J6M(M 1η 1 !X.x°\pGUF\aeDv CJFϓf(8 C^wm$bIDž#ђ hn~ؖZf,3-Cb.jU=Lߩ|5 م٦b-IxY| ?a7dw吭U3ˇw)VAɞTR$/d.(LP<+l6w,KmDfjo7 Pb&$SRܓ.'' 2/Cgj2 Fce)!mɔ3q^,]ssk}\5R6-ɓ|a1Edm0>)co\z!tJmOQ% H3'U(δo3mgݬ!+"O`6R ż %v.].ؗvvt*mB"=,J*+^sO'm1U_l6(N$fu`r"(, H\(XgH?ޏ{tDrB(0i"cI4Z$3][. 
;OĐBI9vv7û&1d-E8+ʛle!~`aeO1_^f_{Tۻ&f7̈PKyN Edd7tPAF!mَyY08g##]m1JgkSL'R_ d ۡTD64eL&A b4Ӧqr@A (SmM}1WTF;c&eC!)Hj%xI>tV]fvli|2)̖RG\d|2m䆅kůxuCLl 6#_E ב 3295q@!2Is )DWcè OJ>vH% w/u6v%v}s2&2Ln$8$ZA#e 0̭7w '𯀏d[$lЫ1h%gnL&*z/A`|?Yg`v$,-h'NXռ tCУf0um5?hjyNFI1hļt1栥Qι#psMk}X]\_\4l:em`")UVWd8~gts׋`K+d[FiR[t𔳨s_AzN@Ai/P vVn4@gյOFI~UZ@z}(A7~εɰY6rdIkkɨc6DAE_|ǕJ`KtF 3RF$ld1v{}x}a.$2Z@OZdTX!ZrK6C;e>õ_}S1Ck B'@ bvm-4Y&@GV9Ⱦvdesd:]c}v B O_?GR(ҏ,D`G|;" fd­G@M|zrҾw%ͽ]];M7X4q~q1/k~gr&<8,X/F:ѷ,d2kXnC$ [?M9H8/O;FTH &2,c%;xa_b~#=0 `ۖMk-T ]'~i~;- iNݝM^2bC͒55g?ՓOl ?8ZIO.$#$VCf,W%5Ŏl4zao&1Dϡ~9sr7g_dy[qsVfM`S6Y:&٬EL2N2@kĜ4A~};O˫&xlb _*P`E>,W֏L`RNo/<+VtmTb0D {󪱎,0~S HX_Vnuq` r6L) v+FrNF%[y˵_\%&WgW͉XKV| 7u\F\LXJȖ{q)#rJlZF.%x'ۢHwY&F& qSrW2p ix45]ݝ6di.c\@)RlM%co&9mCM{1?ѴdVoCy4Y9S2HLTd4<b)CY)kQEN&4~Q؝#MY6ci3z|qy]m.!0Wg'a5X|SiYw=/][oɱ+^T !9fO^}hQLRSMR҈ Ej^E Fonѻn\ZhV^yp6._}K(m#u%*}{…%E h` 'Ƽڕ P/+8KU^KEx]jbUÐ) .}]O㈮o wjgᏟfhZy6:U@v 4n z0s~~൵R}{KQ]bsd;./(281i˜\ERV>GYv+;uޣuOAvmkqRfsgfз6xp[m|MYvG!jy_-/ۏAxSƅaSv,o^,ha@j3O揃D-uUgSѓq,Wei9G!}K|/RF9, o<,J%U%f)a9a6qmj&s(s4e-BiS^3\,x~^p:Ma;d (`YDZ~-yJa0?}}tSYgnlY=`vT(a"*&b(=¤zTb@җ:1?LFZdfÒZr{$$k!?N=:=v!0 v]}XZ}XPc?i` v5eVư^]iz^j GTX}5!#ٞ:un Ss説F]i,UY^DCs٘WrV65jh3jIa&rQhj+}^ 薉\ry iTW^L,IP$*lL$p 䂼c!8 DÕж?TF Fb6K/JUҲC˵`sYuM %;'mfci 料^US [{LM6EU!5u78͍mdq>T$9IN0r߃n5>,,H%W臁NĠDg (2F5S`&=3u46,u֣1&a$МrZxQ[QsQ5aw\JO S3/kQ_\E,HQ88UK$\zak;$ }a,a JyxdUIFLpB6KuQZk6QH0/Ri_qFP݌9/ zNPksiNF :d')(_wɉtJ&c:d$-yP\rT|֐3[m@b<6CZ׾\V3LǼ%Vj!4S_7(F͙dI ڪ# A45wvnE ܰ'Al @ aLc!AlMK*rY|(0Ԕ"-"T2mehU{g.çwwˬgh5a{wMJ Mu[Ldlp7.kunu<*=\Țj7zOM7X7G]Hn{}h#5\yE9^5B@5jE`\|:)CP2p%A@%IH&%I |qn*E=jl^=ׇ4k!C.5;J>E6^Gt Wv&sfmq;Ai3Q!bT'jH21HM ui:WŁ0|QWQhw}!p^9.aFMW}?׶di~]d}"w 4]g{3h|3K'ӛ=EEX\;;C{PmErſNߝđ,w, qͼk'L4~yBȌVŻړi //{~ZLVR81ZdꂼIrB{)=oƨfN[m+U{?< 蠢qk*qcvIR`s)L wf" @mtLFI"F#"h\p K^3XkDž̗3mF\ mso g 孻,3G,?X5Z$ެVS/,R.?G/$LjZ:??Ki8|w?Fg<(w?;E2/f< wP? 
;g.XV᪂+_ 7 ><9y|ASVNWеx>ԲrlQs$Ϫ]$Le:?--FмNKN;Y5|MT/zf_-MN0jT%Sޑ -.d^ =d>sõl~h7I9t3zy4+3i2b}وL1knW@Z|_,C~Zrkf (P\x*QoW&lV~6Q`W:9@1T/u O m{Z=x7?wC\KfڒWHKobR{at佑FcBI]noZg$lݤ/UfEL{MziO7clG˞*گ;:h$K+MY2 ȰC2T-l~$2`ZG TJFn}FQ2jc^(օ@/5TtHPٳeiw~hԆ=j"G\"d#R1+nIuX1%S';]ILӛqļj͢| Ja__rN|gF^+4_)Qa}<ZtZ)H.gw#2Ve7| 뇘sɗĽms;󴨨ru֑MMyw!x2c:m?w{贒ڗn]XDؔpcf~ VuLg4n):4JۻQӻua!sݰnNknZ[;9k@.vk;U380<[CL ߚ`5Vhl<] %.LƩ:·f,[CTu{}eRqǹKy:5 )Bchz]`P,+LEO <7V;bJ¨$`i$(06Ď/s&ON31(jy#D͐kεy;DK&߻7٩|A%1sT[?KE&RЖ勮t4z0`diqCq.xlj& 1f) R2x8M1^nv϶L~|y􆌸2u+8}?I >Ź~ygCMD{h@$D ɇؙ"Uz| e$>q褦Q:I8/q "q\':n1aF_<e^"J?adTX9 ѤDѪ3:m3H`c!!D!0ԶhgS `?νaYE_8U?=M'R{L,ᓋ8$B#,\8b 7ʐ]/h2O'Cr:AG7DH*OH70PYcr4mFk3j03>ǑvAFXizF.^YXRɸ0=_24$|Xv| $ 1:x͔T\_? =NdtlxLfB/KMv۽XT.x)U^lΘ#ZCt c`)ΰYoo虐Z|> ,E.Af= Hkmg W k6Odg% Wrr=L6^.[B \["Tٸj]IBogkU!KruVǐkQ1(c03&@/h UP4.0wx#Qɹdr]n +?ٻ6xSYU~*+>ΓSl!$gWbĮ@RݾZ0r:Z]q³rj6<4>ǧEnKnX4B!'Ea5Z ޵T }Oh`e^/b3r~`͡Wd3 `ٰr]p ǾP&>)OlR??ї5 =\p6TV2A2 0Dy7˃ν\x 185Ҁ }AheF`/i[Ɯs(|):͊]uwdۂOj%aI-4(9~@H/D 9k uKףq jʼP/g]vPAM}1";jaE6Y=@bg]!RYZBcB3ٖ ^u1? -_QEyt /]<@" @ 弎LkHYF'b|ƙJClGnn]QLR{e pN#a5iw%_e'ΝZFϊg]eЭ 6H}# ֟ ;Z(Bh V h׊qV`_]NT]+_Y 8+0O8+.bNJq2rqHTy s,ܭ|)%]+Xp3ޭ|k,=~ J[CLMߥxj%}TޱI)ispzr8;I!NF;8;Q =a.EyE0)uh5e |R}H7DF֦Byj B,MڔKYv~!bf_ˈ"\^Ar;8F3N-ľtUTfOu-\9 {v sQcha, ΥIoӘe&u|W8j }r] M'VK Յ|`᣹,f8}O> V}t-m 06yڲY_0J-,M͛oVYNٿ,=`q {鿽m"d-kM!('}WwͳmN Q,2w{MY< >/` CQyBatWM$A3/g"%ܡ"XW#sEH([EH+p0٩"|e+BZAa!wiPEoPN{C i9zԭ)=~IhT|d㧮c iW8'VFJe8*V]zR)~l)a8LJ 0)[c%\o6mꚥ2f^|YB wKZ=M3-x! [neE ŷ,cКYFs}W/⫈s! E bg,\aHpjSwl,L["'+8*Hf |rHྐྵ~s؎\ٍe؂BF@:=3g Np jCRkޏM?5o:IRc(\FdtJn_V5w}x3J`WbJ$կIfJa5҄jᛣlfJ¼R`A3@ r+dzsLW:d|0>j|t;l2 Sܧ! %wB96<ԫIK<n+]7}Cϊ#k:-k/qy\j# {{Quû<2<)j8xİ2_&#쁛Oas0St &)NД d~UO=*Q~cjG7i{9<}?`R>]YߥRcǬ, 3␵1F;4= I7O("~#[jP* ?nfNg2ƲS,Pϕ @~;7'`B*RrrPY;RH_a$%DPj l% ZH5I5NV} _{Kޢ" fFx'p;Ķo{n, V`Wнl)c p|7x"< idHzpN"U"u>eK2eK5ԝUe`4=zR ^mB蹑"Vdm"~qpm#B̈? 
O.o #W}\1pGU7;ϟa]YsZ~Tw2k N\枍G6{ՋmV]ٲ88Y[|q19Ji8{@'} PjVm/=va';3>>6k.۔* \tM8"U-ݱA*.G 4wIw yJ>hy\ }F)\1wMPkvwŋOoH2^OWAje28/_/d7e-2a[Qpd PWyɅ߫U,CFXP6 ӥ=<:eʹs0"y, dXV`^=?{757ټvV!"Q:8|;R|ɳ& ƹleQŒ F[)L%,QjyR-4S0X UD:bd`Y|% RJ2 Ń `F*G yQ$Hrs$`a4d:5GޣtQJhJ M'fr#*ΗS}:O5h_}(;`UV_ 6yg5-G)JFgYD=ۣtP381[N5l7Jq&J1=RZfT>9y ~r%;ˉ='Vrѻz+taqxHOfZJ6zg0oZouP^-%כÎ\gnXBy6Xk}] Zj ,1,G0j#-4ҶRqݧ]mZI[3NڙU]o S>5M7Ͽ7R#u)Iҧ{}uU0쏚َY{#jciHEp>-z UkEF=0cWMJ6ID7U]]]]fC_t9#¬ oE_o*9΂_^[SE:Zy2 RhRT8x UD+™-o4CGn |7OhbxB#b\;. #jѐ^XX* 2SHRBA,`/L<61 J ΈS#K)`(d.`$O1ձb)¹R`ke4X9 N2,"Da#ܫsR$c{zwZ.]4?66OGٺ<C])aRldfa>ov~5 !jЏ~Oic &ۆ#E+cca\񇿎n~ ,e O-xj(&BS-xD9Em_)e'vl.4rl3 Qu8Q* q׍2dDs*4)k#(%Dؠ^pX>gCq:9#;x%AT{56geᕸ4EFB!~lH_OxzX42k33p&lX6$BxZ ŝGEtE9(E'n;!9oM&ROE俽LI ^*}XOWy0, +/J՛޹lg)ъT3DǎUzDYS{z%ҕ^R\=%jN-ߏX.Y{G>Ո3q %vBvlfG`2/Wro)ɬ*.Y^f^a$m_̪! y" Sa{K@IT[(.֝u8GHLCG?(!G5=1'FB Umo~(kS4]W@}q+X?/4duss TS\-EupAT3NuDk)ɇX`>O[JY͠Jq1$B~]S-ōX(NLO9nF,qo@B^T=Pi 脾GugwYVZ[WtnMH RYy8;nN;|[+gskڎڭ y"Z[^")alv7;NN8.4qϤ9Hy'$1a,(MALA$1*g~qt r``۵cm.# l%1te\Ͽ,EP/f:g:[ g@DWn#F#/ {3|v}[W/ 35M@&wv;ĩ}n鐋H Oa&$SP+R&(w Qb6JIQ gi,yӘSO01j,O&sBV{: .N3AT|`T:Sv}/POĮ7qZhttc  Nr5#%cIS>BQW,ip0Lt,ip0hk+b!L!TtB&qRٱfKʋR%SP^R*m;Qw^@lqW gSL?v57Gj/e_ ZR-8G.#Rk@8m?Ҋ |̾ޞG7Eh/%OƏ&Vju _WerÙV)7<1C}L %K;,N8Z@T\.]&eY?;SjG;dbsfǠ}=Nl:,.*]`uYM}g0gNQqH-Epw_>͏"L{s W.rA^ ^w=9()*ܚ\XL5ӷEcjBP5,U:198Rfy.ej'~%<#! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,LastTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380093 4773 server.go:460] "Adding debug handlers to kubelet server"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380852 4773 factory.go:153] Registering CRI-O factory
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380958 4773 factory.go:221] Registration of the crio container factory successfully
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.381040 4773 factory.go:103] Registering Raw factory
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.381106 4773 manager.go:1196] Started watching for new ooms in manager
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.382795 4773 manager.go:319] Starting recovery of all containers
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390674 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390768 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390799 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390825 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390853 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390877 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390905 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390962 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390994 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391022 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391048 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391078 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391107 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391138 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391166 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391193 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391318 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391349 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391376 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391424 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391452 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391480 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391507 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391534 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391560 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391587 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391621 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391659 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391694 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391720 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391768 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391796 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391822 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391850 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391875 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391898 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391924 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391982 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a"
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392007 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392031 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392056 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392082 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392107 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392133 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" 
seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392160 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392186 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392213 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392242 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392291 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 
18:30:07.392317 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392344 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392379 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392406 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392434 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392465 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392492 4773 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392519 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392546 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392572 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392597 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392621 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392646 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392672 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392697 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392727 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392752 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392809 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392836 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392865 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392890 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392915 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393090 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393129 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393157 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393187 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393215 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393240 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393294 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393319 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" 
seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393343 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393435 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393463 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393489 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393516 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393542 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 
18:30:07.393569 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393596 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393624 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393649 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393675 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393705 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393730 4773 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393756 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393784 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393809 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393835 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393859 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393884 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394831 4773 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394897 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394961 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394992 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395019 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395059 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395090 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395184 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395228 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395258 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395286 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395316 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395343 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395369 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395397 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395424 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395448 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395474 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395498 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395523 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395549 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395576 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395602 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395627 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395653 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395680 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395706 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395735 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395761 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395786 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395811 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395836 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395863 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395887 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395912 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395975 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396003 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396028 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396058 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396083 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396113 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396140 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396164 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396192 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396218 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396243 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396268 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396290 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396315 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396340 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396365 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396390 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396418 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396445 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396471 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396500 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396524 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396549 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396573 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396599 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396626 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396651 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396677 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396702 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396728 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396753 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396777 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396802 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396830 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396857 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396882 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396910 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396971 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396998 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397024 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397051 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397076 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397100 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397124 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397155 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397183 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397210 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397239 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397292 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397316 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397341 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397365 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397391 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397414 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397441 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397469 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397495 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397520 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397547 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397574 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397597 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397620 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397649 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397674 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397699 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397725 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397751 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397774 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397797 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397822 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397847 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397875 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397899 4773 reconstruct.go:97] "Volume reconstruction finished"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397917 4773 reconciler.go:26] "Reconciler: start to sync state"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.415196 4773 manager.go:324] Recovery completed
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.425136 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428762 4773 cpu_manager.go:225] "Starting CPU manager" policy="none"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428792 4773 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428825 4773 state_mem.go:36] "Initialized new in-memory state store"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.442348 4773 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445684 4773 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445747 4773 status_manager.go:217] "Starting to sync pod status with apiserver"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445785 4773 kubelet.go:2335] "Starting kubelet main sync loop"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.445857 4773 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.447019 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.447117 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.455047 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.543117 4773 policy_none.go:49] "None policy: Start"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.544801 4773 memory_manager.go:170] "Starting memorymanager" policy="None"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.544830 4773 state_mem.go:35] "Initializing new in-memory state store"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.546425 4773 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.555186 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.576401 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617023 4773 manager.go:334] "Starting Device Plugin manager"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617251 4773 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617281 4773 server.go:79] "Starting device plugin registration server"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617849 4773 eviction_manager.go:189] "Eviction manager: starting control loop"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617877 4773 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618166 4773 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618294 4773 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618308 4773 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.636438 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.718335 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719877 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.720430 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.746711 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.746846 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748226 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748567 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748594 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749242 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749630 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749658 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750437 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750747 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750769 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751939 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.752162 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752237 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753554 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753585 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.802904 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: 
I0120 18:30:07.803069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803182 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803244 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803265 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803426 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803514 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803545 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904684 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904828 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904870 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904904 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905219 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905134 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905294 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.905475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905709 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905921 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.920676 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922409 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.923073 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.978334 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.079195 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.101649 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.115519 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9 WatchSource:0}: Error finding container 572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9: Status 404 returned error can't find the container with id 572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9 Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.116857 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.123260 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508 WatchSource:0}: Error finding container be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508: Status 404 returned error can't find the container with id be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508 Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.124503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.133336 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.139493 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10 WatchSource:0}: Error finding container 495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10: Status 404 returned error can't find the container with id 495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.156580 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25 WatchSource:0}: Error finding container 8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25: Status 404 returned error can't find the container with id 
8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.163480 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249 WatchSource:0}: Error finding container ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249: Status 404 returned error can't find the container with id ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.304892 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.304994 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.323754 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325280 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.325692 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.353576 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.354602 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:26:34.395298313 +0000 UTC Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.449568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.451004 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.454523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.458233 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.458996 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10"} Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.643195 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.643262 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.752463 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.752566 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" 
logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.764688 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.764773 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.779734 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.125765 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127997 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:09 crc kubenswrapper[4773]: E0120 18:30:09.128510 4773 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.338402 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 18:30:09 crc kubenswrapper[4773]: E0120 18:30:09.340286 4773 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.353488 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.355583 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:38:31.360476695 +0000 UTC Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.465789 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.465885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.466015 4773 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.467962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468669 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a30a34145c90c9e4085aa6a6d5e9eb324c01b4b0ede8fb29198bef6137035672" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a30a34145c90c9e4085aa6a6d5e9eb324c01b4b0ede8fb29198bef6137035672"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468984 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470799 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470903 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478765 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478790 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478735 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480039 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480086 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480192 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481033 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.493793 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.353580 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.356593 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:27:29.505038803 +0000 UTC Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.380507 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s" Jan 20 18:30:10 crc 
kubenswrapper[4773]: E0120 18:30:10.433901 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188c83ecf24addc4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,LastTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485422 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.486975 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae" exitCode=0 Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487081 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.488718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf6d2f31f31a43668f18b56290a5fc01f7f390dd941e4eb69323b5511b6d895f"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.488786 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489713 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490833 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491227 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491436 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: W0120 18:30:10.711315 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.711396 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.729730 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc 
kubenswrapper[4773]: I0120 18:30:10.731183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731204 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.731711 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.356779 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:32:25.209365633 +0000 UTC Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495111 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4" exitCode=0 Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495169 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4"} Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495192 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.499699 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.500248 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f"} Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502118 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502089 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.223649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.261989 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.357594 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:04:54.606780821 +0000 UTC Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506736 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506751 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 
18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.358513 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 09:00:44.866752855 +0000 UTC Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.422318 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda"} Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520489 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520498 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522695 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.932165 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934196 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.358991 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 01:11:21.982905387 +0000 UTC Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.523769 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525193 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:15 crc 
kubenswrapper[4773]: I0120 18:30:15.359668 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:40:13.026350435 +0000 UTC Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.527111 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.359850 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:45:48.363307144 +0000 UTC Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.755207 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.755563 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757728 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.279568 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.360959 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 08:34:57.29075926 +0000 UTC Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.432287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.440922 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.533461 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:17 crc kubenswrapper[4773]: E0120 18:30:17.637436 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.838738 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:18 crc 
kubenswrapper[4773]: I0120 18:30:18.361274 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:02:04.199073323 +0000 UTC Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.536209 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.545424 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.362358 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 07:37:55.966396415 +0000 UTC Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.539097 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540099 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.798435 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.798625 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.363112 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 23:58:19.076771697 +0000 UTC Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.541876 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.838807 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.838880 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:30:20 crc kubenswrapper[4773]: W0120 18:30:20.876749 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.876833 4773 trace.go:236] Trace[836220209]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:10.875) (total time: 10001ms): Jan 20 18:30:20 crc kubenswrapper[4773]: Trace[836220209]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:20.876) Jan 20 18:30:20 crc kubenswrapper[4773]: Trace[836220209]: [10.001571701s] [10.001571701s] END Jan 20 18:30:20 crc kubenswrapper[4773]: E0120 18:30:20.876853 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 18:30:21 
crc kubenswrapper[4773]: W0120 18:30:21.002332 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.002450 4773 trace.go:236] Trace[449536495]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:11.001) (total time: 10001ms): Jan 20 18:30:21 crc kubenswrapper[4773]: Trace[449536495]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:21.002) Jan 20 18:30:21 crc kubenswrapper[4773]: Trace[449536495]: [10.001245205s] [10.001245205s] END Jan 20 18:30:21 crc kubenswrapper[4773]: E0120 18:30:21.002476 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.261731 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.261802 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 
18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.265628 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.265677 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.364002 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:02:43.023646126 +0000 UTC Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.279661 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]log ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]etcd ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-api-request-count-filter ok 
Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/generic-apiserver-start-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-filter ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-controllers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/crd-informer-synced ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-system-namespaces-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 20 18:30:22 crc kubenswrapper[4773]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/bootstrap-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-aggregator-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 20 18:30:22 crc 
kubenswrapper[4773]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-registration-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-discovery-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]autoregister-completion ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapi-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: livez check failed Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.279797 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.365085 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:30:14.470622816 +0000 UTC Jan 20 18:30:23 crc kubenswrapper[4773]: I0120 18:30:23.365272 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:00:26.103562578 +0000 UTC Jan 20 18:30:24 crc kubenswrapper[4773]: I0120 18:30:24.366420 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:56:38.061066167 +0000 UTC Jan 20 18:30:24 crc kubenswrapper[4773]: I0120 18:30:24.837436 4773 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 
18:30:25 crc kubenswrapper[4773]: I0120 18:30:25.366677 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:53:20.863884946 +0000 UTC Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.252203 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254036 4773 trace.go:236] Trace[38491623]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:14.421) (total time: 11832ms): Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[38491623]: ---"Objects listed" error: 11832ms (18:30:26.253) Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[38491623]: [11.832718825s] [11.832718825s] END Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254063 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254472 4773 trace.go:236] Trace[89087432]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:11.458) (total time: 14795ms): Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[89087432]: ---"Objects listed" error: 14795ms (18:30:26.254) Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[89087432]: [14.795846736s] [14.795846736s] END Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254495 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.255560 4773 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.256032 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes 
\"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.258887 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.274959 4773 csr.go:261] certificate signing request csr-l9ws6 is approved, waiting to be issued Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.283177 4773 csr.go:257] certificate signing request csr-l9ws6 is issued Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.356074 4773 apiserver.go:52] "Watching apiserver" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.359498 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.359746 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360296 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.360357 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360723 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360801 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.361011 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.361131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.361206 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.361264 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364611 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364824 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364948 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366577 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366727 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366736 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366815 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366813 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366845 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:37:42.880814694 +0000 UTC Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.378112 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.416342 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.434246 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.447498 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456040 4773 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456403 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456428 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456451 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456496 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 
crc kubenswrapper[4773]: I0120 18:30:26.456553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456574 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456593 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456652 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456703 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456706 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456764 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456813 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456830 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456987 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457010 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457056 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457210 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457275 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457312 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457349 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457368 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457387 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457466 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457588 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457670 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457712 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457730 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457843 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457900 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457964 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458049 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458071 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 
18:30:26.458113 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458180 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458226 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458245 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458268 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458288 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458311 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458363 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458381 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458399 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458417 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458455 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458574 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458608 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458650 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458668 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458687 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458727 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458745 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458783 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458875 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458892 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458911 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458954 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458976 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458992 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 
18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459012 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459052 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459075 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459096 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459114 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459134 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459169 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457583 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459231 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457609 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457772 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457887 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458052 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458132 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458310 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458458 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458619 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458632 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458716 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458824 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459006 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459466 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459519 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459688 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459769 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459913 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459981 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460013 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460089 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460112 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460138 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: 
I0120 18:30:26.460234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460254 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460287 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460306 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460358 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460435 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460471 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460487 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460537 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460554 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460623 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460693 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460714 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460732 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461276 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461300 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461334 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461351 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461369 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461391 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461425 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461440 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461730 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461751 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461804 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461904 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461952 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461995 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462018 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460017 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460371 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.462160 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.962131115 +0000 UTC m=+19.883944249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466648 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466880 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466905 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.467063 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467126 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.467159 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.967137639 +0000 UTC m=+19.888950663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") 
pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467625 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467646 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467671 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467685 4773 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467698 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467711 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467725 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 
20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467738 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467752 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467766 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467781 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467810 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467825 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467840 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467855 4773 reconciler_common.go:293] 
"Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467868 4773 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467882 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467895 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467911 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467924 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467966 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467979 4773 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" 
DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467991 4773 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468004 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468017 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468029 4773 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468043 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468056 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468069 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468082 4773 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468095 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468118 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468133 4773 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468145 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468158 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468171 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468184 4773 reconciler_common.go:293] "Volume detached for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468198 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468213 4773 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468228 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468241 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468256 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468269 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468279 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468333 4773 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468344 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468354 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468366 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468376 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468385 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.469189 4773 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457849 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462654 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462685 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463102 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463485 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463493 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463891 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464022 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.474727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464237 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464455 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465227 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465292 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465807 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465958 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476331 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468116 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462430 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468307 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468418 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468701 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.469384 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.471765 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472866 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.473290 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.473518 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.474737 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475013 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.475289 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477113 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477188 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.479163 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.479307 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.480016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.482622 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.982588485 +0000 UTC m=+19.904401499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486011 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486048 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486475 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486578 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486654 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486798 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:26.986774252 +0000 UTC m=+19.908587496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486906 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488347 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488354 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489545 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489788 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490019 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490246 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491129 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491173 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491186 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491259 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.991240366 +0000 UTC m=+19.913053390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.491760 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.492590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.492945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493129 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493229 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.494220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.494395 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.496638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.496941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497594 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497629 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497904 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498231 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498293 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498540 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499030 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.501223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.503889 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.504543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.504600 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.505844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.506578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507277 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507414 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507900 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508186 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508302 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508549 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508665 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508700 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.510870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511146 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511422 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511540 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512209 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512285 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513690 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.514010 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.515463 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.515752 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.518590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.521650 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.522650 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.529402 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.533349 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.536705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.538714 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.554146 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.555754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.558434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.567767 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f" exitCode=255 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.567823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f"} Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569228 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569551 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569678 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569695 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569713 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569725 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569736 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569752 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569764 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569775 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569786 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569799 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569811 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569822 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569837 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569864 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569876 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569888 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569898 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569914 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569926 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569955 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 
18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569972 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569983 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569994 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570008 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570038 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570052 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570063 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570074 4773 reconciler_common.go:293] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570089 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570109 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570121 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570136 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570153 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570165 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570178 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570205 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570216 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570229 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570242 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570258 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570270 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570281 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570293 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570307 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570318 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570329 4773 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570346 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570370 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570382 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" 
Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570394 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570409 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570420 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570431 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570442 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570456 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570467 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570479 4773 reconciler_common.go:293] "Volume detached for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570506 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570517 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570529 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570547 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570558 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570570 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570583 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570598 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570610 4773 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570621 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570633 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570648 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570661 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" 
Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570674 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570690 4773 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570702 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570715 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570729 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570746 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570760 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc 
kubenswrapper[4773]: I0120 18:30:26.570781 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570794 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570811 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570823 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570837 4773 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570853 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570866 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570878 4773 reconciler_common.go:293] 
"Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570889 4773 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570906 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570918 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570960 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570973 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570989 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571001 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571017 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571030 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571045 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571057 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571068 4773 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571083 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571095 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571107 
4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571118 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571134 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571145 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571158 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571169 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571184 4773 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571196 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571209 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571237 4773 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571250 4773 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571262 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571274 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571289 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571301 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571312 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571326 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571340 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571352 4773 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571369 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571381 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571396 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571409 4773 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571421 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571437 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571452 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571464 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571476 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571491 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571503 4773 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571515 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571527 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571542 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571553 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571564 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571579 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571591 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 
20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571604 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571615 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571631 4773 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571646 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571659 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571672 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571688 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.594148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.610029 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.621158 4773 scope.go:117] "RemoveContainer" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.628455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.631327 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.631570 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gczfj"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.632027 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.636650 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.636889 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.637050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.656533 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.669212 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.674267 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.674319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.680053 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.684464 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.685265 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.689829 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.694267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.707081 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e WatchSource:0}: Error finding container d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e: Status 404 returned error can't find the container with id d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.709130 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.712727 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720 WatchSource:0}: Error finding container 1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720: Status 404 returned error can't find the container with id 1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720 Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.717485 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5 WatchSource:0}: Error finding container c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5: Status 404 returned error can't find the 
container with id c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5 Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.720332 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: source /etc/kubernetes/apiserver-url.env Jan 20 18:30:26 crc kubenswrapper[4773]: else Jan 20 18:30:26 crc kubenswrapper[4773]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 20 18:30:26 crc kubenswrapper[4773]: exit 1 Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 20 18:30:26 crc kubenswrapper[4773]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.721652 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.726840 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.732609 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f "/env/_master" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: source "/env/_master" Jan 20 18:30:26 crc kubenswrapper[4773]: set +o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Jan 20 18:30:26 crc kubenswrapper[4773]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 20 18:30:26 crc kubenswrapper[4773]: ho_enable="--enable-hybrid-overlay" Jan 20 18:30:26 crc kubenswrapper[4773]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 20 18:30:26 crc kubenswrapper[4773]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 20 18:30:26 crc kubenswrapper[4773]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-host=127.0.0.1 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-port=9743 \ Jan 20 18:30:26 crc kubenswrapper[4773]: ${ho_enable} \ Jan 20 18:30:26 crc kubenswrapper[4773]: --enable-interconnect \ Jan 20 18:30:26 crc kubenswrapper[4773]: --disable-approver \ Jan 20 18:30:26 crc kubenswrapper[4773]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --wait-for-kubernetes-api=200s \ Jan 20 18:30:26 crc kubenswrapper[4773]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --loglevel="${LOGLEVEL}" Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.732984 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.734043 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.739333 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f "/env/_master" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: source "/env/_master" Jan 20 18:30:26 crc kubenswrapper[4773]: set +o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --disable-webhook \ Jan 20 18:30:26 crc kubenswrapper[4773]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --loglevel="${LOGLEVEL}" Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.741200 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.747385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.768995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.787701 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.801810 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.813123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.815960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.950118 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.961745 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357ca347_8fa9_4f0b_9f49_a540f14e0198.slice/crio-7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86 WatchSource:0}: Error finding container 7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86: Status 404 returned error can't find the container with id 7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86 Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.964063 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 20 18:30:26 crc kubenswrapper[4773]: set -uo pipefail Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 20 18:30:26 crc kubenswrapper[4773]: HOSTS_FILE="/etc/hosts" Jan 20 18:30:26 crc kubenswrapper[4773]: TEMP_FILE="/etc/hosts.tmp" Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Make a temporary file with the old hosts file's attributes. Jan 20 18:30:26 crc kubenswrapper[4773]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 20 18:30:26 crc kubenswrapper[4773]: echo "Failed to preserve hosts file. Exiting." 
Jan 20 18:30:26 crc kubenswrapper[4773]: exit 1 Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: while true; do Jan 20 18:30:26 crc kubenswrapper[4773]: declare -A svc_ips Jan 20 18:30:26 crc kubenswrapper[4773]: for svc in "${services[@]}"; do Jan 20 18:30:26 crc kubenswrapper[4773]: # Fetch service IP from cluster dns if present. We make several tries Jan 20 18:30:26 crc kubenswrapper[4773]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 20 18:30:26 crc kubenswrapper[4773]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 20 18:30:26 crc kubenswrapper[4773]: # support UDP loadbalancers and require reaching DNS through TCP. Jan 20 18:30:26 crc kubenswrapper[4773]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 20 18:30:26 crc kubenswrapper[4773]: for i in ${!cmds[*]} Jan 20 18:30:26 crc kubenswrapper[4773]: do Jan 20 18:30:26 crc kubenswrapper[4773]: ips=($(eval "${cmds[i]}")) Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: svc_ips["${svc}"]="${ips[@]}" Jan 20 18:30:26 crc kubenswrapper[4773]: break Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Update /etc/hosts only if we get valid service IPs Jan 20 18:30:26 crc kubenswrapper[4773]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 20 18:30:26 crc kubenswrapper[4773]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 20 18:30:26 crc kubenswrapper[4773]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 20 18:30:26 crc kubenswrapper[4773]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: continue Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Append resolver entries for services Jan 20 18:30:26 crc kubenswrapper[4773]: rc=0 Jan 20 18:30:26 crc kubenswrapper[4773]: for svc in "${!svc_ips[@]}"; do Jan 20 18:30:26 crc kubenswrapper[4773]: for ip in ${svc_ips[${svc}]}; do Jan 20 18:30:26 crc kubenswrapper[4773]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ $rc -ne 0 ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: continue Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 20 18:30:26 crc kubenswrapper[4773]: # Replace /etc/hosts with our modified version if needed Jan 20 18:30:26 crc kubenswrapper[4773]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 20 18:30:26 crc kubenswrapper[4773]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: unset svc_ips Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4b9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-gczfj_openshift-dns(357ca347-8fa9-4f0b-9f49-a540f14e0198): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.967033 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-gczfj" podUID="357ca347-8fa9-4f0b-9f49-a540f14e0198" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.978184 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.978260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978306 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:27.978275981 +0000 UTC m=+20.900089015 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978382 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978437 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:27.978424635 +0000 UTC m=+20.900237659 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079614 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079765 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079780 4773 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079790 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079845 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.079817991 +0000 UTC m=+21.001631015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080437 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080529 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.080510228 +0000 UTC m=+21.002323252 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080449 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080671 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080733 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080833 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.080816534 +0000 UTC m=+21.002629558 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.167332 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.255583 4773 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256011 4773 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256016 4773 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256067 4773 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256134 4773 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256159 4773 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256168 4773 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256223 4773 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256252 4773 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256254 4773 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256255 4773 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short 
watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256272 4773 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256282 4773 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.257196 4773 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.267965 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.283334 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.284322 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-20 18:25:26 +0000 UTC, rotation deadline is 2026-10-15 22:00:25.3154403 +0000 UTC Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.284384 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6435h29m58.031058844s for next certificate rotation Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.294220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.303458 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.313819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.326220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.335290 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.346272 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.354573 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.367704 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 12:49:16.588012406 +0000 UTC Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.371972 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kjbfj"] Jan 20 
18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.372619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373020 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-sq4x7"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373305 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bccxn"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373502 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374503 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374735 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374742 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.375489 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.375736 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376145 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376290 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376635 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.377303 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.378333 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.379022 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.381215 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383182 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383569 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383700 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383798 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.384126 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.386573 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.390439 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.406537 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.415699 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.424424 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.435114 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.443145 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.446295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.446547 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.450561 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.451105 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.451950 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.452594 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.453152 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.453663 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455220 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455739 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.456891 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.457497 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.458514 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.459291 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.459833 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.461348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.461836 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.462677 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.463367 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.463725 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.464701 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.465276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.466089 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.466683 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.467129 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468075 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468112 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468815 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.470040 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.470756 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.471817 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.472483 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.473452 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.474016 4773 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.474136 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.476453 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.476909 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.477364 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.479447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.479570 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.481005 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.481628 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.482892 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483658 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484045 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" 
(UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484071 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484114 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484226 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484247 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484267 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484310 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484368 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484458 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.484478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484703 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484726 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484768 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484813 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484834 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtw7\" (UniqueName: \"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484883 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod \"multus-bccxn\" (UID: 
\"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484958 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.485000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.485020 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.486426 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.487162 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.488490 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.489865 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.490396 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.491594 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.491942 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.492487 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.493911 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.494542 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.495119 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.496187 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.496898 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.498080 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.498623 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.502410 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.511747 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.520460 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.531375 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.540396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.552277 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.561027 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.570717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.571769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.573679 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.577678 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.577972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.578184 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.581580 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gczfj" event={"ID":"357ca347-8fa9-4f0b-9f49-a540f14e0198","Type":"ContainerStarted","Data":"7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.582514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585698 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc 
kubenswrapper[4773]: I0120 18:30:27.585747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585807 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585823 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585837 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585867 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtw7\" (UniqueName: 
\"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593715 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod 
\"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593784 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593862 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593888 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc 
kubenswrapper[4773]: I0120 18:30:27.593964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594065 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.594089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594135 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594178 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594199 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.587990 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588002 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod 
\"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 
18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.589291 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595180 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588749 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588301 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595603 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588340 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: 
\"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595681 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.595807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588818 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595850 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.597531 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.598055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.599685 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.606736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.610655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.611394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.612384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtw7\" (UniqueName: \"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.616919 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.622050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.630688 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.651177 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.667431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.677464 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.685847 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.689998 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.689983 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.697908 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.699147 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.709342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.732353 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.745011 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70 
WatchSource:0}: Error finding container a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70: Status 404 returned error can't find the container with id a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70 Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.771370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.792314 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.811808 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.821466 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.833921 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name
\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.844819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.845891 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.850145 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.859697 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.860397 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.885986 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.922188 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.959547 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:27.999997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.000108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000176 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:30.000151193 +0000 UTC m=+22.921964237 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000209 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000273 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.000256485 +0000 UTC m=+22.922069559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.000433 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.039084 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.086338 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 
18:30:28.101326 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101339 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101391 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101405 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101420 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.101398477 +0000 UTC m=+23.023211501 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101339 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101472 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101479 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.101462358 +0000 UTC m=+23.023275442 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101484 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101525 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.10151389 +0000 UTC m=+23.023326984 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.123300 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.161757 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.174978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.214053 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.233919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.267988 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.274476 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.323166 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.368213 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:56:51.38414991 +0000 UTC Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.369759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.394332 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.425838 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.446256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.446288 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.446436 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.446519 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.462660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.503785 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.514081 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.574072 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.605702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.610841 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3" exitCode=0 Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.610965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3"} Jan 20 18:30:28 crc 
kubenswrapper[4773]: I0120 18:30:28.610997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"410001a1d3881fa68033cb522fb1036ff5be18d13872f61c3fe53b410c458aa8"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615367 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745" exitCode=0 Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615454 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerStarted","Data":"6fcca40afa0a2739c1fcbd5cffd85ebe7836e640caaaf2f50d873e009384d4a3"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.620560 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.620598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.622839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.622904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"375987ce103d4dda6ae8622d9203ab8286d343e5a762d0690e4f438712cdf1f0"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.625317 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gczfj" event={"ID":"357ca347-8fa9-4f0b-9f49-a540f14e0198","Type":"ContainerStarted","Data":"48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.626977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.648676 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.674188 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.694879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.733899 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.738007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.782406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.813891 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.854545 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 
18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.866061 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.874806 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.924269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name
\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.962309 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.004620 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.045397 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.086690 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.126551 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.182446 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.206123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.245983 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.277671 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5sv79"] Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.278108 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.285344 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.296351 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.314241 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.319404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc 
kubenswrapper[4773]: I0120 18:30:29.319452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.319485 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.334557 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.353257 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.368331 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:47:10.49736871 +0000 UTC Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.405958 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.421280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.443608 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.446828 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:29 crc kubenswrapper[4773]: E0120 18:30:29.446956 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.471515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.503105 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.544131 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.590216 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.607900 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.623888 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633707 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633723 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.635488 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e" exitCode=0 Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.635543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.662226 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: W0120 18:30:29.690033 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a565d2f_43a1_41f5_b7a6_85d7d0aea0a7.slice/crio-08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d WatchSource:0}: Error 
finding container 08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d: Status 404 returned error can't find the container with id 08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.702650 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.750190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.782178 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.827607 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.866186 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.887564 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.900769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.906029 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.927443 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.963394 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.028152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:30 crc kubenswrapper[4773]: 
E0120 18:30:30.028249 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.028233087 +0000 UTC m=+26.950046111 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.028378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.028473 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.028517 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.028505663 +0000 UTC m=+26.950318687 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.030369 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.063370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.083925 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.124482 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.128950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.128999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.129025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:30 
crc kubenswrapper[4773]: E0120 18:30:30.129146 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129170 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129192 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129190 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129206 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129217 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129199204 +0000 UTC m=+27.051012228 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129221 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129238 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129258 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129241995 +0000 UTC m=+27.051055059 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129286 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129271545 +0000 UTC m=+27.051084569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.161449 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.202250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.240681 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.285472 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.322660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.361689 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.368788 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 14:31:27.096957975 +0000 UTC Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.400886 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.441897 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.446983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.447090 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.447120 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.447291 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.484229 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.527864 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.564200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.603243 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.639923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5sv79" event={"ID":"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7","Type":"ContainerStarted","Data":"d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.639995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5sv79" event={"ID":"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7","Type":"ContainerStarted","Data":"08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.641610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.643735 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f" exitCode=0 Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.643776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.645267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.651077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.685648 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.724016 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\
\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.772672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.807252 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.844534 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.883155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.924872 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.966755 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.006756 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.043036 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.081988 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.129795 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 
18:30:31.168148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 
18:30:31.222084 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.244176 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.283672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.322293 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.364158 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.369276 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 23:03:17.68404031 +0000 UTC Jan 20 18:30:31 crc 
kubenswrapper[4773]: I0120 18:30:31.405034 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.445695 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.446082 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:31 crc kubenswrapper[4773]: E0120 18:30:31.446240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.655166 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2" exitCode=0 Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.655216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2"} Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.670565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.682698 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.701780 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.722423 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.747322 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20
T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resource
s-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.770249 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.785955 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.797525 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.807909 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.842034 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.886883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.921623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.964843 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.001794 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.040577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.370400 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 07:01:28.984350076 +0000 UTC Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.446084 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.446226 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.446249 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.446472 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.656306 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658299 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.661221 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce" exitCode=0 Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.661292 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.666579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.669912 4773 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 20 18:30:32 crc 
kubenswrapper[4773]: I0120 18:30:32.670655 4773 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.672040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.672110 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.678559 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.686452 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.689975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690069 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.692828 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.703328 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.706633 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707469 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.720697 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723973 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.735171 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.738660 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741089 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.749493 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.752216 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.755574 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758218 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.763146 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.776827 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.789444 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.798264 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.806744 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.820742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.833908 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4
a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.852562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"con
tainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: 
I0120 18:30:32.860402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860429 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.870782 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.886140 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963574 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168724 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271138 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.370777 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:23:28.091367264 +0000 UTC Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373844 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.446438 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:33 crc kubenswrapper[4773]: E0120 18:30:33.446690 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.477280 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580471 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682235 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157" exitCode=0 Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682925 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.709208 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f1
16faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.726882 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.744304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.758992 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.777385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786313 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.794304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.820447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.839232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.855218 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.872843 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.890426 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc 
kubenswrapper[4773]: I0120 18:30:33.898524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.909059 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.931612 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.948210 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.965155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001828 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.072626 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.072834 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.072974 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.072891229 +0000 UTC m=+34.994704253 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.073045 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.073121 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.073109184 +0000 UTC m=+34.994922208 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174347 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174406 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174432 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174449 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174465 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174516 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174538 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174478 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174450119 +0000 UTC m=+35.096263153 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174601 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174586472 +0000 UTC m=+35.096399516 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174624 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174612133 +0000 UTC m=+35.096425177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209278 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.311968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.371031 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:16:02.595114926 +0000 UTC Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416648 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.446709 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.446918 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.447595 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.447756 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519779 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.626911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.626980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.689818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.690201 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.693689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerStarted","Data":"52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.710196 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.715142 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.728844 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 
18:30:34.729822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729865 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.743656 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.761317 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.775858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.792746 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.808900 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.823077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.832804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.832963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.833037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 
18:30:34.833127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.833198 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.836856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.852548 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.872809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18
:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.890048 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.906237 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.925779 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.935966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936052 4773 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.942872 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.962320 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d27
2b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.977132 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.990878 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.007831 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.024280 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.036754 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038884 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.058496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.072425 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-2
0T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.088183 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.104504 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.116117 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.125675 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.135743 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140755 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.151441 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.170417 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.242956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 
18:30:35.243045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345662 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.371539 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:42:27.231848579 +0000 UTC Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.446811 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:35 crc kubenswrapper[4773]: E0120 18:30:35.446996 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551292 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.696105 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.696280 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.721713 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.734042 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.745353 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.758310 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc 
kubenswrapper[4773]: I0120 18:30:35.759679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.780257 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.795711 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.807731 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.822194 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.835563 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.845993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.855536 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862345 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.872405 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.885348 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.910357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.926567 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.941784 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067881 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.169984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272879 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.372815 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:44:01.416023939 +0000 UTC
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.375894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.446410 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.446893 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:30:36 crc kubenswrapper[4773]: E0120 18:30:36.447102 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 20 18:30:36 crc kubenswrapper[4773]: E0120 18:30:36.447330 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.478893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479737 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583816 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687206 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.699200 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789767 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.892990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.965706 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996429 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099654 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203304 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306115 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.373046 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:12:21.008170054 +0000 UTC
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409962 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.446749 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:37 crc kubenswrapper[4773]: E0120 18:30:37.446907 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.463385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.475879 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.492299 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.511960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512640 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.543617 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.563465 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.582570 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.606495 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.625181 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z 
is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.649721 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.668455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.701715 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.704570 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.708708 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" exitCode=1 Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.708753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.709434 4773 scope.go:117] "RemoveContainer" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721562 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721585 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.722867 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.740048 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.759803 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.782334 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.797815 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.815571 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599
f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829635 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.834818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.850903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.864688 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.880104 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.894283 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.918911 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933274 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.935883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.951408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.964557 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.978275 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.991908 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.002891 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036997 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.344956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345044 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.373867 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:58:03.898779605 +0000 UTC Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.446060 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:38 crc kubenswrapper[4773]: E0120 18:30:38.446232 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.446284 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:38 crc kubenswrapper[4773]: E0120 18:30:38.446428 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447934 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652641 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.713928 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.721702 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.721843 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.740399 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.757703 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.774781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.789175 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.817074 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.831406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.843746 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857079 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.869426 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z 
is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.883634 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.898113 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.911433 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.931591 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.948761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960703 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.966250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.014033 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.032823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc 
kubenswrapper[4773]: I0120 18:30:39.063785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063821 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.066096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.081861 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.095681 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.114396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.136219 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.155667 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.168982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169118 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.170895 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.190994 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.206349 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.221666 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.244725 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.263974 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 
18:30:39.272955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272972 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.280809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.294607 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.374066 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 18:40:38.021838154 +0000 UTC Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.379010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.379025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.446331 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:39 crc kubenswrapper[4773]: E0120 18:30:39.446674 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483404 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483558 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.690972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.730281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.731580 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.735822 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" exitCode=1 Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.735878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.736008 4773 scope.go:117] "RemoveContainer" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.737473 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:39 crc kubenswrapper[4773]: E0120 18:30:39.737846 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.761305 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.777210 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794974 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.795877 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.813645 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.840795 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.866631 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-
cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.885305 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898273 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.908121 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.922121 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.934254 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.945566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.955620 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.974818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.991079 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001691 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.003432 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.203082 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k"] Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.203764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.207064 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.207658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.209437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.237702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.256151 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.271328 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.289065 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.307618 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.312976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313148 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.320496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.334148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 
18:30:40.362845 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-
cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363589 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363657 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363751 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.374756 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 18:34:35.743154853 +0000 UTC Jan 20 
18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.377799 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.391001 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.404786 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.417242 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.429445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.440839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.446063 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.446075 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.446158 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.446393 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.457416 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.466353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.466673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.469663 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.479474 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.488375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 
18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519341 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.526506 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: W0120 18:30:40.544976 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7821f5e_4734_489f_bcf9_910b875a4848.slice/crio-204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98 WatchSource:0}: Error finding container 204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98: Status 404 returned error can't find the container with id 204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98 Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623405 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.739694 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.743221 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.746660 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.746829 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.758925 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.770298 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.782490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.794469 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.803217 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.812910 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.824104 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827969 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.846273 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.861760 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.873445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.892189 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.913174 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.931815 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.945173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.945615 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.945678 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.949851 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.962452 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.992747 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.008958 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.026607 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.045308 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.065190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.072251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.072309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.086521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.119534 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72
fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138849 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.142116 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.154647 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.171051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.173778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.173989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.174040 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.174336 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:41.674303343 +0000 UTC m=+34.596116377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.190217 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.198477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" 
(UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.210643 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/et
c/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.224664 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.240858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc 
kubenswrapper[4773]: I0120 18:30:41.242193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242217 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.255543 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.271893 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.291157 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.306855 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.374906 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:41:27.866268921 +0000 UTC Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.446634 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.446867 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552756 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655115 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.680914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.681145 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.681211 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.681195446 +0000 UTC m=+35.603008480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.753574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.753869 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.770357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.784331 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.800153 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.816713 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.831088 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.843132 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860636 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.865156 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.888077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.901319 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.927573 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.943043 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.957799 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.972492 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.989517 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.002501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.019448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.030479 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066718 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.086011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.086235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086329 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.086269912 +0000 UTC m=+51.008082976 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086442 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086647 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.08662947 +0000 UTC m=+51.008442534 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.183008 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187492 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187544 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187709 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187716 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187733 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187744 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187752 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187758 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187805 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187788932 +0000 UTC m=+51.109601976 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187829 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187817872 +0000 UTC m=+51.109630906 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187832 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187879 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187860103 +0000 UTC m=+51.109673137 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286289 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286320 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.376208 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 19:40:30.220960774 +0000 UTC Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390527 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.446959 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.447076 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447182 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.447076 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447374 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598483 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.695092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.695347 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.695497 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:44.695463053 +0000 UTC m=+37.617276107 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703701 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808832 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.912882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.912978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913047 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935659 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.957423 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.989618 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994733 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.011802 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017538 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.036797 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.064678 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.064806 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066810 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170240 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170327 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273355 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376361 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:16:13.469981906 +0000 UTC Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.446167 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.446316 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790902 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894562 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997867 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.309972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.376977 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:52:24.742014989 +0000 UTC Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446277 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446354 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446316 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446500 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446609 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446749 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516896 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.623983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.718305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.718630 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.718768 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:48.718733475 +0000 UTC m=+41.640546539 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727697 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830938 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934295 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037582 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141326 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244843 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.377156 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 22:41:43.738132095 +0000 UTC Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.446225 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:45 crc kubenswrapper[4773]: E0120 18:30:45.446460 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450592 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.656998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657106 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760365 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863632 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.070999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174543 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278446 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278473 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.377844 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:48:07.417897546 +0000 UTC Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381630 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446635 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446672 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446770 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447005 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447094 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447184 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588268 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795414 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898736 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001813 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104859 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311754 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.378068 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:50:42.576655323 +0000 UTC
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414309 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.447040 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:47 crc kubenswrapper[4773]: E0120 18:30:47.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.474442 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.497884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.517154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.517971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518076 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518129 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.554419 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab
352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.570881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.588076 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.605792 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.620336 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621525 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621567 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.631709 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.653138 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.666628 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.679307 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.697773 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.712548 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.729579 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.744195 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.754898 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827809 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931379 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034556 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240728 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.378238 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 21:21:57.672306348 +0000 UTC Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446037 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446076 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446091 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446288 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446434 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446567 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.550916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551104 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654687 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.767032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.767175 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.767261 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:56.767236177 +0000 UTC m=+49.689049241 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861918 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964727 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068412 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274243 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377468 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.378725 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 04:46:06.925714397 +0000 UTC Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.446338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:49 crc kubenswrapper[4773]: E0120 18:30:49.446526 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480772 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584372 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791590 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.998908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.998985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206501 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309428 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.341606 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.342868 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.379158 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 09:09:54.658737128 +0000 UTC Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446715 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446753 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446799 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.446997 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.447185 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.447341 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622265 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726116 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.787911 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.790632 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.791533 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.806605 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.819619 4773 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828838 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.840123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.854188 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.866700 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.878968 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.907976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.928232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930997 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.931011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.931020 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.947767 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.977164 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.989781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.001881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.014118 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.028092 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkub
e-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.038262 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.057097 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.068336 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc 
kubenswrapper[4773]: I0120 18:30:51.135549 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135672 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341260 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.379729 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:53:48.388164588 +0000 UTC Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.444014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.444024 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.446432 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:51 crc kubenswrapper[4773]: E0120 18:30:51.446544 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.547027 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.797113 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.798139 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802130 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" exitCode=1 Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802178 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802217 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.803258 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:30:51 crc kubenswrapper[4773]: E0120 18:30:51.803556 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.818600 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.838313 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc 
kubenswrapper[4773]: I0120 18:30:51.856025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.864179 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.886161 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.914738 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.932431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.953063 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958948 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.978692 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.004221 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.023532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.041364 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.062135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062255 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.068137 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] 
New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net
.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.083973 4773 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.102271 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.117564 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.133267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.150201 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.165152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165163 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268356 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371997 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.380123 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 13:23:42.791905919 +0000 UTC Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446538 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.446661 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446559 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.446729 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446538 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.447080 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475258 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578272 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680701 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.807763 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.811661 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.811811 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.841730 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.858960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.874562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.898696 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.922617 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.935659 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.953102 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.967875 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.988754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988768 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.998284 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.016839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.037734 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.059045 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.077583 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090960 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.098170 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.109134 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.120112 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.135889 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193686 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.307807 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.323716 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.340859 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343694 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.354304 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357369 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.371147 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.371268 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372811 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.380980 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:01:13.589640059 +0000 UTC Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.446603 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.446823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475433 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475470 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577758 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679786 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781768 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090213 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296091 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.382021 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:54:49.631810337 +0000 UTC Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.446553 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.446684 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.447063 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.447110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.446504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.447227 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603654 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810275 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016399 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118907 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.382821 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:09:05.717255496 +0000 UTC Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.427978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.446947 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:55 crc kubenswrapper[4773]: E0120 18:30:55.447083 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.737010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.737034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839928 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942561 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.047063 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.151017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254718 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357744 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.383259 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 17:38:19.362707641 +0000 UTC Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.446972 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.447013 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.447325 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447841 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667994 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770985 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.855398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.855585 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.855638 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:12.855625778 +0000 UTC m=+65.777438802 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873896 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977214 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184245 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.286983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287061 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.383947 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:22:37.333042535 +0000 UTC Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390725 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.447124 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:57 crc kubenswrapper[4773]: E0120 18:30:57.447316 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.465312 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.480038 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493866 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.509443 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.528623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.550735 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.571287 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.594172 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597113 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.613396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.630077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.663493 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.681964 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700567 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.705881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.723208 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.740912 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.765884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.796776 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803662 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.818989 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc 
kubenswrapper[4773]: I0120 18:30:57.906215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906280 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009988 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112788 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.170694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.170885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171171 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171316 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.171279992 +0000 UTC m=+83.093093046 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171810 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.171784123 +0000 UTC m=+83.093597207 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216911 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272797 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273109 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273140 4773 configmap.go:193] 
Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273172 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273211 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273275 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.273237742 +0000 UTC m=+83.195050806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273131 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273306 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:31:30.273292853 +0000 UTC m=+83.195105917 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273323 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273347 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273431 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.273403696 +0000 UTC m=+83.195216770 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.321160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.321433 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.384624 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 01:24:52.751688167 +0000 UTC Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.425116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.425340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.446630 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.446757 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.446856 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.447018 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.447194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.447446 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528374 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632392 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736638 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.945155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.945282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058426 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162187 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266261 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.386142 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:53:13.519491254 +0000 UTC Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.446364 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:59 crc kubenswrapper[4773]: E0120 18:30:59.446499 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.578000 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.784717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.818879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.838729 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.847787 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.871150 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.883675 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.897903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.920913 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542
044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.937393 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.949672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.971637 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.982792 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.990987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991037 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991068 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.993102 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.004207 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.016023 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.025879 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.043578 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.053359 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.067055 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.079679 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094655 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198857 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.302076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.302170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.386995 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 20:46:12.428713536 +0000 UTC Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404843 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.446915 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.446993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.447089 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.447858 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.447615 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.448032 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.507558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612371 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819835 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.923764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.925127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028203 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132277 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.236355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237141 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.387697 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:09:02.480745908 +0000 UTC Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443446 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443528 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.445951 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:01 crc kubenswrapper[4773]: E0120 18:31:01.446082 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546330 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649728 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753441 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.856911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857982 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.962028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.962047 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065173 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169189 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.388150 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:21:41.316565099 +0000 UTC Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447340 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447409 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447501 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447335 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447686 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447754 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482298 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.690025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.690048 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793204 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896577 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000339 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103605 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309716 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.388495 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 14:20:50.957193147 +0000 UTC Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.412986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.446873 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.447183 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516887 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620488 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661680 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.680293 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685692 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.702044 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.707034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.707170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.728596 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734992 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.756916 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.762991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763592 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.786440 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.787279 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.790007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.790022 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893766 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099575 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203540 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307630 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.389569 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 06:19:07.282423754 +0000 UTC Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412917 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446264 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446596 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.446762 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446887 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.446975 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517328 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.620484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034296 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138777 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241759 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345413 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.390550 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 12:28:58.994119857 +0000 UTC
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.446369 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:05 crc kubenswrapper[4773]: E0120 18:31:05.446784 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.448160 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d"
Jan 20 18:31:05 crc kubenswrapper[4773]: E0120 18:31:05.448625 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.555052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.555082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659673 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.764007 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.867871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.867982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075633 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179499 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283574 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.386384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387680 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.391533 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:17:33.01024901 +0000 UTC
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446590 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447469 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447631 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597337 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.702023 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.805715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013497 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221781 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325530 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.391652 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 05:05:58.576393588 +0000 UTC
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.428539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.428987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.446283 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:07 crc kubenswrapper[4773]: E0120 18:31:07.446539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.472155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.494682 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.511706 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.533995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc 
kubenswrapper[4773]: I0120 18:31:07.534110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534132 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534714 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.550542 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.567965 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.584521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.601332 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.617790 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.639010 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.659740 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.682161 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73
de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.696878 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.718619 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.734367 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.748898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749388 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749420 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.750149 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.765759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.777960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851694 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953887 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364970 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.392209 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:31:14.940068406 +0000 UTC Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446263 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446322 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446430 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446455 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446600 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446730 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468193 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674315 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.984005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.984016 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087537 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.392677 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 05:40:33.781169559 +0000 UTC Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398303 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.448578 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:09 crc kubenswrapper[4773]: E0120 18:31:09.449023 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500539 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.602998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.809856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.809978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810053 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914347 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.393127 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:48:34.421596952 +0000 UTC Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429541 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446651 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446692 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446830 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.446885 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.446978 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.447186 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532904 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.635896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636316 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739202 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841618 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.046986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149449 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149527 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252533 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356489 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.393394 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:43:31.002488579 +0000 UTC Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.446364 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:11 crc kubenswrapper[4773]: E0120 18:31:11.446527 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458827 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.561973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562062 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666319 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769271 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872468 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975352 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077942 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.180949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291243 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393638 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 14:32:01.117762962 +0000 UTC Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446715 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446737 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446836 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447051 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447192 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447234 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700631 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.860036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.860265 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.860385 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:44.860359299 +0000 UTC m=+97.782172513 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.905958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906073 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214586 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316838 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.393819 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:19:47.643263138 +0000 UTC Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.446543 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:13 crc kubenswrapper[4773]: E0120 18:31:13.446801 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523176 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.625917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626059 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728909 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927553 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927615 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" exitCode=1 Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927653 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.928248 4773 scope.go:117] "RemoveContainer" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.943668 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.959891 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.970892 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894
ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.992051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.004326 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006728 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006772 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006787 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.020525 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.020576 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d0
32a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc 
kubenswrapper[4773]: I0120 18:31:14.025428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025441 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.039281 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.043209 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048494 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.055435 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.068299 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.070739 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073492 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.082920 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.085605 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090244 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.098200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.103147 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.103282 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105217 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.111410 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.127115 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.151686 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.168593 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.187582 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208147 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.210264 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.229735 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311677 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.395022 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:02:51.442982845 +0000 UTC Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.446920 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.447080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.446973 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447283 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447494 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447633 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517683 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833188 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935991 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.936023 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.936106 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.957074 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.972285 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.984613 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.013311 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.026241 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040014 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 
18:31:15.040703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040714 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.053712 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.072876 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.085583 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.095996 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.106243 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.118835 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.130098 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142953 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.151339 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.165193 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.175967 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.189578 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.201133 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.244987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347222 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.395612 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 15:56:55.586753958 +0000 UTC Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.447122 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:15 crc kubenswrapper[4773]: E0120 18:31:15.447255 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754788 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856902 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959397 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.061980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062059 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164311 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266911 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.369974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370026 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370053 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.395826 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:32:33.366681482 +0000 UTC Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446161 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446171 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446276 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446487 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678304 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781114 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883637 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162566 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264640 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367615 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.396963 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:23:26.058207323 +0000 UTC Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.446661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:17 crc kubenswrapper[4773]: E0120 18:31:17.447140 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.447416 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.459727 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469636 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.476023 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.490853 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.507247 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.522623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.534645 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.553090 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: 
I0120 18:31:17.571880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.580542 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.596251 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.611279 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.626708 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.645095 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 
18:31:17.673719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673632 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.713367 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.731496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.748075 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.766880 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.777490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983486 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.058542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.061165 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.061569 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.079214 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086770 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.092448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.108532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.121085 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.131296 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.142258 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.163222 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.182742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189383 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.196363 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc 
kubenswrapper[4773]: I0120 18:31:18.209184 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.264656 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.278026 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.289976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291580 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.305455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.318837 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.338116 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.352788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.365786 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.397342 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 15:26:34.603319382 +0000 UTC Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.446869 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.446966 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447158 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.447192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447350 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447429 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497284 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599168 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701790 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907395 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.066149 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.067128 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069350 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" exitCode=1 Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069416 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.070260 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:19 crc kubenswrapper[4773]: E0120 18:31:19.070461 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.085198 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.099448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc 
kubenswrapper[4773]: I0120 18:31:19.113353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113388 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.115829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18
:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 
18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.129232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.140044 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.149040 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.160501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.171784 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.191258 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.203772 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.215986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216036 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.217218 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.232346 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.248027 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] 
Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.261363 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.273503 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.285461 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.302993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.315428 4773 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc 
kubenswrapper[4773]: I0120 18:31:19.317778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.398181 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 20:12:28.906482044 +0000 UTC Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420569 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.446659 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:19 crc kubenswrapper[4773]: E0120 18:31:19.446862 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728535 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831452 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037962 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.074546 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.078866 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.079110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.093907 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.107133 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.118884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.140284 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141402 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.155162 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc 
kubenswrapper[4773]: I0120 18:31:20.167390 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.184269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.197616 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.216457 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.229182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.243761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244821 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.261465 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.282610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.295200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.309871 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.327649 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.346332 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.358566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.399295 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-12-23 13:00:35.385821228 +0000 UTC Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446812 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446893 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447014 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446816 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447357 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450676 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552910 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655584 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860400 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065739 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168405 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270713 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270730 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.373789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374096 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.400250 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:15:42.826491996 +0000 UTC Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.446314 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:21 crc kubenswrapper[4773]: E0120 18:31:21.446453 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476026 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681926 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785086 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888354 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990854 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093191 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.195993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.400358 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:36:05.448689653 +0000 UTC Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.402005 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447083 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447238 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447353 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447262 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447479 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447532 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.504974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711603 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815232 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918796 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125518 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331772 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.400743 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 01:09:36.77589273 +0000 UTC Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433840 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.446616 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:23 crc kubenswrapper[4773]: E0120 18:31:23.446828 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537163 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639875 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742580 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050895 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153907 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256833 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.401212 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 14:56:22.267479221 +0000 UTC Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446271 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446297 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446303 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446502 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446627 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446760 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454685 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.476900 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.500023 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.505901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.505993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506055 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.526052 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535915 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.558192 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.562991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563430 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.581588 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.581707 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.583967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687690 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790398 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893345 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201491 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303963 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.401844 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 12:48:57.815322028 +0000 UTC Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.408701 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.410030 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.446768 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:25 crc kubenswrapper[4773]: E0120 18:31:25.446991 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513240 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.615965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616826 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.720451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.720896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824394 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029388 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.132062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.132161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.235355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.235814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236477 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339752 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.402053 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:46:55.989217178 +0000 UTC Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.443001 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447079 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447079 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447338 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447173 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447618 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447501 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.545748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546253 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648690 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750724 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853539 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059758 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.163003 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367894 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.402257 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:03:06.719857221 +0000 UTC Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.447064 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:27 crc kubenswrapper[4773]: E0120 18:31:27.447171 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.467556 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471309 4773 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.472038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.472215 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.487848 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.511095 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.531096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.546687 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.562192 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.577003 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.580463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.598007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.617585 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.630250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.642691 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.659711 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.674272 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678919 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.688887 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.701539 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.711754 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.732858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.746408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782146 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885456 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989823 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195853 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298312 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401849 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.402589 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 16:51:16.050657711 +0000 UTC Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446075 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446205 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446270 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446413 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446576 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609331 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712398 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816218 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.919029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.919049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.124918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227553 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.403567 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:51:28.440835814 +0000 UTC Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.446697 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:29 crc kubenswrapper[4773]: E0120 18:31:29.446860 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642149 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.745017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848402 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.951967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952064 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056378 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159917 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.204610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.204874 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:32:34.204834553 +0000 UTC m=+147.126647707 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.204999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.205172 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.205252 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.205231112 +0000 UTC m=+147.127044306 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263148 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.305920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.306002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.306042 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306130 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306142 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306177 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306191 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306193 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306178188 +0000 UTC m=+147.227991222 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306246 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306230899 +0000 UTC m=+147.228043913 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306381 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306457 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306485 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306596 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306565618 +0000 UTC m=+147.228378822 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.404470 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 15:04:03.115125548 +0000 UTC Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446377 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446491 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446524 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446683 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446733 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.457719 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468689 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571410 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674194 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777508 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880480 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880538 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.982925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983053 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983093 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085853 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189136 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.291721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.292248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.292593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.293010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.293177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396512 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.405362 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:15:48.729436583 +0000 UTC Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.447294 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:31 crc kubenswrapper[4773]: E0120 18:31:31.447527 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.501035 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.808564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.808796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.912965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.015966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118155 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.221884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.221983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222060 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.325895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.325989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326102 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.407064 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 22:25:40.487737774 +0000 UTC Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446991 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447065 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447159 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446881 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.531011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.531019 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633623 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736175 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.839004 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941373 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.043988 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044063 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248361 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.408500 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:13:10.556703074 +0000 UTC Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.447144 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:33 crc kubenswrapper[4773]: E0120 18:31:33.447358 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.448373 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:33 crc kubenswrapper[4773]: E0120 18:31:33.448644 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453706 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660207 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763316 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969497 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969559 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072740 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175969 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.409393 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 00:34:21.519649853 +0000 UTC Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446872 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446889 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446908 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447398 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447573 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483809 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586795 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.689993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793854 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896809 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.983589 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.011153 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.036014 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040088 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.053837 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057573 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.070583 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.070818 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073189 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382422 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.409862 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:09:50.125099781 +0000 UTC Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.446694 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.447060 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485764 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692737 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796310 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900330 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315317 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.410485 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 13:08:26.042562535 +0000 UTC Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418212 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446636 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446699 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.446807 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446642 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.446997 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.447244 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.523979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730603 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832833 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935225 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038212 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140986 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347166 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.450202 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 00:02:40.284262618 +0000 UTC Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.451092 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:37 crc kubenswrapper[4773]: E0120 18:31:37.454094 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.470870 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.486007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.506764 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.525691 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.558740 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559595 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.579724 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.639733 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" podStartSLOduration=71.639709856 podStartE2EDuration="1m11.639709856s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.639232095 +0000 UTC m=+90.561045149" watchObservedRunningTime="2026-01-20 18:31:37.639709856 +0000 UTC m=+90.561522910" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663708 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.667451 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bccxn" podStartSLOduration=71.66742618399999 podStartE2EDuration="1m11.667426184s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.666084252 +0000 UTC m=+90.587897376" watchObservedRunningTime="2026-01-20 18:31:37.667426184 +0000 UTC m=+90.589239238" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.703373 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.703319086 podStartE2EDuration="38.703319086s" podCreationTimestamp="2026-01-20 18:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.701757709 +0000 UTC m=+90.623570743" watchObservedRunningTime="2026-01-20 18:31:37.703319086 +0000 UTC m=+90.625132120" Jan 20 
18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.704230 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" podStartSLOduration=70.704222448 podStartE2EDuration="1m10.704222448s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.684217033 +0000 UTC m=+90.606030067" watchObservedRunningTime="2026-01-20 18:31:37.704222448 +0000 UTC m=+90.626035482" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.714167 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.714148013 podStartE2EDuration="7.714148013s" podCreationTimestamp="2026-01-20 18:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.713375015 +0000 UTC m=+90.635188049" watchObservedRunningTime="2026-01-20 18:31:37.714148013 +0000 UTC m=+90.635961047" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.759703 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podStartSLOduration=71.759685865 podStartE2EDuration="1m11.759685865s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.731748121 +0000 UTC m=+90.653561155" watchObservedRunningTime="2026-01-20 18:31:37.759685865 +0000 UTC m=+90.681498909" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 
18:31:37.766494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766546 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.820474 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.820452317 podStartE2EDuration="1m10.820452317s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.819225948 +0000 UTC m=+90.741038972" watchObservedRunningTime="2026-01-20 18:31:37.820452317 +0000 UTC m=+90.742265341" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868638 4773 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868684 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.971482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.971878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972310 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075790 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178536 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281798 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.446785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.446865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.446906 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.447119 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.447216 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.447416 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.450368 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:55:46.2000213 +0000 UTC Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488150 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591327 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693682 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796599 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.899958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900120 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107172 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209996 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312916 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416226 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.446767 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:39 crc kubenswrapper[4773]: E0120 18:31:39.447966 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.450664 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:11:18.270772508 +0000 UTC Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622197 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.725001 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.827990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828114 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.930915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931263 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035377 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139156 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242428 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447174 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447244 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447181 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.447497 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.447653 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.448054 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449240 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.451591 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:11:23.241689854 +0000 UTC Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551980 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655141 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860520 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964583 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170650 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.376855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.376996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377081 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.447043 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:41 crc kubenswrapper[4773]: E0120 18:31:41.447250 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.452011 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 15:42:55.997824451 +0000 UTC Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480500 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.584024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.584045 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687848 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.791004 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.791017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894274 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100627 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203142 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306485 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.446975 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.447097 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.447335 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447531 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447768 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.452322 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:04:45.255468762 +0000 UTC Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513645 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718404 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718549 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822813 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925684 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028158 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131670 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.235009 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338540 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442893 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.447192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:43 crc kubenswrapper[4773]: E0120 18:31:43.447410 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.452479 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 18:49:05.329682131 +0000 UTC
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.546976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547100 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650700 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753784 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857712 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.960003 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062700 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164748 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268406 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372440 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372506 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.447189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.447433 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.447773 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd"
Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.447894 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.448169 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.448280 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.453491 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:21:16.739862731 +0000 UTC
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.475966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476030 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.578876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579533 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579815 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682262 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784667 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.868598 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd"
Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.868915 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.869082 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:48.869050448 +0000 UTC m=+161.790863512 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888178 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991765 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.197985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198158 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301848 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340231 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.399141 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"]
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.399492 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.402575 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.402985 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.403240 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.403439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.436161 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.436142342 podStartE2EDuration="1m19.436142342s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.435959428 +0000 UTC m=+98.357772452" watchObservedRunningTime="2026-01-20 18:31:45.436142342 +0000 UTC m=+98.357955376"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.446656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:45 crc kubenswrapper[4773]: E0120 18:31:45.446788 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.453658 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:37:23.611407843 +0000 UTC
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.453711 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.459665 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.462700 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gczfj" podStartSLOduration=79.462686612 podStartE2EDuration="1m19.462686612s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.462250922 +0000 UTC m=+98.384063946" watchObservedRunningTime="2026-01-20 18:31:45.462686612 +0000 UTC m=+98.384499636"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.473952 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5sv79" podStartSLOduration=79.473907789 podStartE2EDuration="1m19.473907789s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.473059129 +0000 UTC m=+98.394872143" watchObservedRunningTime="2026-01-20 18:31:45.473907789 +0000 UTC m=+98.395720823"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476545 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476596 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.497556 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.49753396 podStartE2EDuration="1m16.49753396s" podCreationTimestamp="2026-01-20 18:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.495511792 +0000 UTC m=+98.417324816" watchObservedRunningTime="2026-01-20 18:31:45.49753396 +0000 UTC m=+98.419347004"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578346 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578494 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.579926 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.590366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.609503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"
Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.722404 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: W0120 18:31:45.747306 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25126370_c138_4fa2_af29_896492cb6a1c.slice/crio-6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583 WatchSource:0}: Error finding container 6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583: Status 404 returned error can't find the container with id 6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583 Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.161060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" event={"ID":"25126370-c138-4fa2-af29-896492cb6a1c","Type":"ContainerStarted","Data":"3c2764b6b0fd4f90ece02472679aeef92be73c90d7df4455940bfa1908f4245d"} Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.161107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" event={"ID":"25126370-c138-4fa2-af29-896492cb6a1c","Type":"ContainerStarted","Data":"6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583"} Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.178875 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" podStartSLOduration=80.178841016 podStartE2EDuration="1m20.178841016s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:46.177459813 +0000 UTC m=+99.099272837" watchObservedRunningTime="2026-01-20 18:31:46.178841016 +0000 UTC m=+99.100654060" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446374 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446436 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446570 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446665 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446889 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:47 crc kubenswrapper[4773]: I0120 18:31:47.446261 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:47 crc kubenswrapper[4773]: E0120 18:31:47.448351 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:47 crc kubenswrapper[4773]: I0120 18:31:47.449909 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:47 crc kubenswrapper[4773]: E0120 18:31:47.450418 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.446421 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.446737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.446910 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.447170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.447267 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.447515 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:49 crc kubenswrapper[4773]: I0120 18:31:49.446913 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:49 crc kubenswrapper[4773]: E0120 18:31:49.447154 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446091 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446092 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446171 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.446898 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.447107 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.447106 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:51 crc kubenswrapper[4773]: I0120 18:31:51.446206 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:51 crc kubenswrapper[4773]: E0120 18:31:51.446475 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447102 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447102 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447301 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447466 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447117 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447778 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:53 crc kubenswrapper[4773]: I0120 18:31:53.446555 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:53 crc kubenswrapper[4773]: E0120 18:31:53.446840 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446041 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446327 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446322 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446445 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446775 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:55 crc kubenswrapper[4773]: I0120 18:31:55.446899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:55 crc kubenswrapper[4773]: E0120 18:31:55.447062 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447245 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447297 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447254 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447479 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447579 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447684 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:57 crc kubenswrapper[4773]: I0120 18:31:57.446497 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:57 crc kubenswrapper[4773]: E0120 18:31:57.448520 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446191 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446267 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.446955 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.447559 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.447808 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.447971 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.448075 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:59 crc kubenswrapper[4773]: I0120 18:31:59.446312 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:59 crc kubenswrapper[4773]: E0120 18:31:59.446480 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.207306 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208106 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208195 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" exitCode=1 Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208246 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7"} Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208305 4773 scope.go:117] "RemoveContainer" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208724 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.209009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446706 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446807 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.446875 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.447051 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.447154 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:01 crc kubenswrapper[4773]: I0120 18:32:01.214625 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:01 crc kubenswrapper[4773]: I0120 18:32:01.447110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:01 crc kubenswrapper[4773]: E0120 18:32:01.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.446659 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.446692 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.447601 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.447800 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.448152 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.448232 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:03 crc kubenswrapper[4773]: I0120 18:32:03.447077 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:03 crc kubenswrapper[4773]: E0120 18:32:03.447240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446416 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446489 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446416 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446578 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446678 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446801 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:05 crc kubenswrapper[4773]: I0120 18:32:05.446978 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:05 crc kubenswrapper[4773]: E0120 18:32:05.447108 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446041 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446440 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446159 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446703 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446949 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.388589 4773 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 20 18:32:07 crc kubenswrapper[4773]: I0120 18:32:07.446638 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.448616 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.657858 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.446818 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.447160 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.447279 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:09 crc kubenswrapper[4773]: I0120 18:32:09.454197 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:09 crc kubenswrapper[4773]: E0120 18:32:09.455373 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446458 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.446539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.446688 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.447127 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:11 crc kubenswrapper[4773]: I0120 18:32:11.446774 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:11 crc kubenswrapper[4773]: E0120 18:32:11.446971 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447023 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447066 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447115 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447210 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447277 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.660046 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 18:32:13 crc kubenswrapper[4773]: I0120 18:32:13.446922 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:13 crc kubenswrapper[4773]: E0120 18:32:13.447079 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:13 crc kubenswrapper[4773]: I0120 18:32:13.447859 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.252633 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.254923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0"} Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.255366 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.281314 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podStartSLOduration=108.281296109 podStartE2EDuration="1m48.281296109s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:14.279308361 +0000 UTC m=+127.201121385" watchObservedRunningTime="2026-01-20 18:32:14.281296109 +0000 UTC m=+127.203109133" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446544 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446608 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.446710 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446620 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.446861 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.447091 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.468234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.257911 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:15 crc kubenswrapper[4773]: E0120 18:32:15.258284 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.446619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:15 crc kubenswrapper[4773]: E0120 18:32:15.447072 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.447281 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.262362 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.263089 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65"} Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.446919 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.447097 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447103 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.447142 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447228 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447375 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:17 crc kubenswrapper[4773]: I0120 18:32:17.446201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:17 crc kubenswrapper[4773]: E0120 18:32:17.447326 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446370 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446414 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446515 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.450796 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.450887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451167 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451404 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 18:32:19 crc kubenswrapper[4773]: I0120 18:32:19.446961 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:20 crc kubenswrapper[4773]: I0120 18:32:20.365125 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.825595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.875040 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876007 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876726 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876973 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.877534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.877826 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.878345 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.879689 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.880045 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.881753 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.882037 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.882122 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884257 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884341 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884802 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885078 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885102 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 18:32:25 crc 
kubenswrapper[4773]: I0120 18:32:25.885693 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885210 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885429 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.886427 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887065 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887495 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887845 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888187 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888240 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888387 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.898762 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.897829 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.901919 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.909805 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.927169 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.927795 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928071 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928439 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.929579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931128 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931231 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jpq\" (UniqueName: \"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931320 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931377 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931477 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931477 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931723 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931739 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931769 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931843 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931883 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.933986 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.934553 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935103 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935647 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935740 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.937494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.938105 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.938607 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xwc5v"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939191 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939407 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.941362 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.946355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.946836 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.947118 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.950629 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.950867 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951045 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951192 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951538 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951735 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951905 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.952712 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.953459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.953626 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954001 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954591 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bkbfc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.955280 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960150 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960396 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960669 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960907 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.961462 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.961618 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963046 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963261 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963809 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.964691 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965138 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965422 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.966949 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967177 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967678 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967953 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968074 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968103 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968145 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968252 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968368 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968494 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.987407 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.988259 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.989450 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993146 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993453 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993653 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994240 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994305 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"]
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994394 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994300 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.995575 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011258 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011304 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011683 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011798 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012509 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012603 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012835 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012924 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.013175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.013762 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.014004 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.019384 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.019996 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.020359 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.020404 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.024530 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.025188 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.027336 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.030367 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x95ml"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.031029 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.031624 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.032128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.033213 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.034759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.035085 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.036400 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.041823 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"]
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.038447 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.039165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.038272 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.042995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqxj\" (UniqueName: \"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID:
\"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043573 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043955 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044035 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044113 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044286 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044359 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044436 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044662 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044816 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.046978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.046380 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.042010 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047404 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047519 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.040823 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048765 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048858 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048775 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047216 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049016 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049091 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049177 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049191 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmmk6\" (UniqueName: \"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049261 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049377 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049392 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049674 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049717 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.049803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049845 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jpq\" (UniqueName: \"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049871 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.049958 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050061 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod 
\"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050143 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050163 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050182 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050205 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050245 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050268 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050365 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050405 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.050455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.050548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050811 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050839 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050880 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050956 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: 
\"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.053032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.053652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.054059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.054151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: 
\"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055082 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055889 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: 
\"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.056496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.056992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.057107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod 
\"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.057471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.058578 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.058860 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.060967 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.063306 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.064738 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.067904 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.069616 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.069897 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.073329 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.074268 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.074380 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.075951 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078208 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078494 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078502 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.080261 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.088830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.089481 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.090047 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.092018 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.093681 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.094003 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.094328 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.096371 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.098116 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.098829 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.099170 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.099486 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.103430 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cv7zc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.103873 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.107163 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.107846 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.109620 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.110504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.111034 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.112464 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.113811 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.113924 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.114598 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118599 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118622 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.119407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.120586 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.121655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.122775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.123835 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.124970 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"] Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.126306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.127319 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.128353 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.129411 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.130451 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.131435 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.132440 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.133420 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.134622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.134747 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.135342 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.135632 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.138109 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.139054 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.139186 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.140159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.141200 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.142206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.143201 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.144219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.145250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.146258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.147256 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.148334 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.149398 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.150646 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.151530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 
18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152700 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152791 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmmk6\" (UniqueName: 
\"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152899 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 
18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " 
pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: 
\"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153392 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.153449 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153488 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod 
\"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: 
I0120 18:32:26.153640 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153950 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154210 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqxj\" (UniqueName: 
\"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154492 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: 
I0120 18:32:26.154586 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154643 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154807 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156712 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156684 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157862 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.158443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.159305 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.159579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.160469 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.160650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161109 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.162178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.162761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.163142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.163636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164092 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164489 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164688 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.165448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.167140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.168482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.168540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.169148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.169545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.173113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.174516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.175707 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.177773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod 
\"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.179253 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.179373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.186290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.190597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.190710 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.214700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.222326 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.240654 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.249781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.257984 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.263695 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.278090 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.298793 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.303265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.319154 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.328697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.338918 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.344859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.358648 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.365001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.378690 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.388525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.398728 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.419155 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.438274 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.458633 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.479116 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.486371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.498528 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.519163 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.538950 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.558668 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.579448 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.599228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.618825 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.678015 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.698271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.718864 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.753669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.771007 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.779285 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.798511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.833888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.839013 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.848512 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.860154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.880690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.886873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.899820 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.918075 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.944901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jpq\" (UniqueName: \"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.954396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.956300 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.961276 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.979588 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.999360 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.025459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.038453 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"] Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.039275 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.049676 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99225b3_64c7_4b39_807c_c97faa919977.slice/crio-f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf WatchSource:0}: Error finding container f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf: Status 404 returned error can't find the container with id f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.059480 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: 
I0120 18:32:27.078344 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"]
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.078438 4773 request.go:700] Waited for 1.008339815s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.080466 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.099018 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.120843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.133102 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"]
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.138376 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.140670 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.149454 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d49ef4e_91fb_4b98_89d9_65358c718967.slice/crio-3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632 WatchSource:0}: Error finding container 3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632: Status 404 returned error can't find the container with id 3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.158715 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.178812 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.199395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.219753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.227752 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.239229 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.259815 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.279228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.298222 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.302222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"b0881d57fbbc77ac5ee40bd606c8fb06314b176343932d84de3ec1f7a4c35da9"}
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.302856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf"}
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.303583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerStarted","Data":"3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632"}
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.304317 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerStarted","Data":"0be8a6611d182b19e3c280dba8bf32b32d7fa146a5cc6f6279d1419ba167e1bf"}
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.314061 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"]
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.318582 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.328748 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49deabd4_ebbe_4c07_bb79_105982db000a.slice/crio-96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28 WatchSource:0}: Error finding container 96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28: Status 404 returned error can't find the container with id 96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.338511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.357783 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.378620 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.394367 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"]
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.399590 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.418385 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.438549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.459641 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.479710 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.498757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.518714 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.539296 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.558350 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.578316 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.599280 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.619005 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.638852 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.658875 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.678919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.710136 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.719130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.739853 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.758961 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.779826 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.799549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.819111 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.842079 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.846616 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod411d251b_6daa_4c45_9aeb_aa38def60a90.slice/crio-4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e WatchSource:0}: Error finding container 4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e: Status 404 returned error can't find the container with id 4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.877593 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.878374 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.898384 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.919372 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.939220 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.958626 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.979724 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.999459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.019644 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.038741 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.059029 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.097180 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.097321 4773 request.go:700] Waited for 1.943418555s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.117760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmmk6\" (UniqueName: \"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.133351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.158103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.170803 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.170868 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.176048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.192243 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.211088 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqxj\" (UniqueName: \"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.222369 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.230058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.238051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xwc5v"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.245398 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.253791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.265978 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bkbfc"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.275678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.295728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.317506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"d980837bf111ca1f394543e6e39fde1c816899c4e72eae4a1d98c93ac335fb11"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.319400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"79654ac68476f4932f494a5ac7dc8a6128258838b2997ca43f88dc63b1ac8fc0"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.321090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.322974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerStarted","Data":"bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.323202 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.328106 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.328163 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.329490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.331543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerStarted","Data":"0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.332707 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.333564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" event={"ID":"411d251b-6daa-4c45-9aeb-aa38def60a90","Type":"ContainerStarted","Data":"4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e"}
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.334829 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.334868 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.335115 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.347696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.348132 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.353908 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.365258 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.374036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.375555 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.384921 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.404167 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.404679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.477238 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"]
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.492059 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"]
Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.537530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"]
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.430386 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431067 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431206 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431421 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.434361 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.439218 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.439268 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440717 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440826 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.441108 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.442657 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.442682 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.452227 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:29.952195177 +0000 UTC m=+142.874008211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.539364 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podStartSLOduration=122.539333255 podStartE2EDuration="2m2.539333255s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:29.532193639 +0000 UTC m=+142.454006703" watchObservedRunningTime="2026-01-20 18:32:29.539333255 +0000 UTC m=+142.461146289"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543718 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543775 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.543860 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.043828576 +0000 UTC m=+142.965641610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543994 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544029 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544185 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc 
kubenswrapper[4773]: I0120 18:32:29.544765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.545676 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.045653861 +0000 UTC m=+142.967466895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.546078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.553851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649561 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.650151 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.150127657 +0000 UTC m=+143.071940691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650753 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651028 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651090 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651125 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651160 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651186 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651204 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.651811 4773 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.151791957 +0000 UTC m=+143.073604981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652538 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652579 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652796 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652999 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653020 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.653833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654021 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654118 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.656426 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.656888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.657618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.657975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.658011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.658040 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659649 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659891 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.660366 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.662330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r74z\" (UniqueName: \"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665337 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod 
\"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.666497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.678154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.695994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.696974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.698020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.698400 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.698659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.701219 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.708917 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podStartSLOduration=123.708834413 podStartE2EDuration="2m3.708834413s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:29.706451715 +0000 UTC m=+142.628264779" watchObservedRunningTime="2026-01-20 18:32:29.708834413 +0000 UTC m=+142.630647457" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.772392 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.772850 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.27282677 +0000 UTC m=+143.194639794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.772960 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773090 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" 
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773347 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.773429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773642 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773776 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773797 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773819 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773838 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773865 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773879 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: 
\"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773917 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773981 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r74z\" (UniqueName: \"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774013 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774065 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774085 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774104 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774133 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774244 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.775265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.775495 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.275486346 +0000 UTC m=+143.197299360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.776434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.778167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.785670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.787517 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.787687 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.788985 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.791649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.792110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.792673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.793188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.793807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.794625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795717 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.806690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.810558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811143 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.812126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.812221 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.816269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.818716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.819148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.821951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.822906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.827706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.828984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.820235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.834129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.834142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.835651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.836235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.843259 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.844966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.856200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.856655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.859724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.874844 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.876594 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.376572578 +0000 UTC m=+143.298385602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.885876 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r74z\" (UniqueName: \"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.894940 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x95ml"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.895162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.930222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.943326 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.953065 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.956702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.957211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.965347 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.967191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"]
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.970186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.973153 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.992865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.993250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.993650 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.493636674 +0000 UTC m=+143.415449698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.043190 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cv7zc"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.050324 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.054986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.054920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.094452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.095049 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.595027922 +0000 UTC m=+143.516840946 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.109095 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.124561 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.126669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.129795 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.140908 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.145129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.145532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.152619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.192061 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.197714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.198056 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.698042332 +0000 UTC m=+143.619855356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.226792 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.227249 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"
Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.290218 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.299643 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.299970 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.799953894 +0000 UTC m=+143.721766918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.307439 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.317453 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.323618 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.328486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"] Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.329706 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.361220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.362030 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"] Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.400704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.401045 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.901030816 +0000 UTC m=+143.822843840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.404493 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.415406 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.451532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerStarted","Data":"4f519175ce1269f87277348e7dbf3cb7cac77cd634d740bc906a3ed7230ae289"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.459610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"33ccc5f46ace640fd63928e13135a80b2c741299077595b13ba47c85c98cc041"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.479201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"56724ec1814776d34f41e4bc888bb2f625416edf2dd917631378c1d20dace01d"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.501996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.503199 4773 csr.go:261] certificate signing request csr-kd9nr is approved, waiting to be issued Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.504697 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.00467265 +0000 UTC m=+143.926485674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.514683 4773 csr.go:257] certificate signing request csr-kd9nr is issued Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" event={"ID":"6bae1b17-1679-4be9-9717-66c5a80ad425","Type":"ContainerStarted","Data":"c18851e3c7438f7238fd35df3adacffca533ec4c2ab48865a2762a6813ac4b07"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" 
event={"ID":"6bae1b17-1679-4be9-9717-66c5a80ad425","Type":"ContainerStarted","Data":"d7718bdf724d16f5015d86421784af9c2854acd759c9fc1ebd71bab194622459"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.540477 4773 generic.go:334] "Generic (PLEG): container finished" podID="49deabd4-ebbe-4c07-bb79-105982db000a" containerID="baa5478ba640cda543e6ae852f7964fed77445e700440fdfc2fde3fb0f5449ae" exitCode=0 Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.540551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerDied","Data":"baa5478ba640cda543e6ae852f7964fed77445e700440fdfc2fde3fb0f5449ae"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.547546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x95ml" event={"ID":"00a9d467-1154-4eae-b1e5-19dfbb214a80","Type":"ContainerStarted","Data":"c0193781aec3b0e7b7da2b2e0e33d420b0bb0a76805b05f74d9b55ef584d52cb"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.553095 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" event={"ID":"411d251b-6daa-4c45-9aeb-aa38def60a90","Type":"ContainerStarted","Data":"6ca7a9b933f7fd2bb8b345e043d89d1ac113240c60a3aa639ec42e057c1be12b"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.558248 4773 patch_prober.go:28] interesting pod/console-operator-58897d9998-xwc5v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 20 18:32:30 crc kubenswrapper[4773]: 
I0120 18:32:30.558307 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" podUID="6bae1b17-1679-4be9-9717-66c5a80ad425" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.559946 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"e44f11bab386417c3b1877eca6ca0ea6e968d94a4f4dc89a6c32f3e76ff34c9d"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.559976 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"7a46a714094ecc49f930d8d4bd7c16a60cb470db4a63635254dca73bbbd15dc9"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.560915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cv7zc" event={"ID":"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b","Type":"ContainerStarted","Data":"d02e93dbd4aa6093aa2aaed871798afaa3fde7066996fc90ff89276acd6fc5df"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.573461 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerStarted","Data":"e7d6c34d2f903961f01ee5fdb975fd359e3c9fc20a9e900b7c3df54dd10fd2d7"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.590698 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.616821 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.622220 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.122206768 +0000 UTC m=+144.044019782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.721686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.731440 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.231407399 +0000 UTC m=+144.153220423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.793654 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" podStartSLOduration=124.793631383 podStartE2EDuration="2m4.793631383s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.756075827 +0000 UTC m=+143.677888851" watchObservedRunningTime="2026-01-20 18:32:30.793631383 +0000 UTC m=+143.715444407" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.796512 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" podStartSLOduration=123.796495403 podStartE2EDuration="2m3.796495403s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.793407907 +0000 UTC m=+143.715220931" watchObservedRunningTime="2026-01-20 18:32:30.796495403 +0000 UTC m=+143.718308427" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.824869 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.825595 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.325470008 +0000 UTC m=+144.247283032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.905002 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" podStartSLOduration=124.904969688 podStartE2EDuration="2m4.904969688s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.892516281 +0000 UTC m=+143.814329305" watchObservedRunningTime="2026-01-20 18:32:30.904969688 +0000 UTC m=+143.826782732" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.905690 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" podStartSLOduration=124.905680655 podStartE2EDuration="2m4.905680655s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:32:30.841423681 +0000 UTC m=+143.763236725" watchObservedRunningTime="2026-01-20 18:32:30.905680655 +0000 UTC m=+143.827493679" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.926185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.927233 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.427214756 +0000 UTC m=+144.349027780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.029673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.030033 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.53002097 +0000 UTC m=+144.451833994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.131244 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.131836 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.631813509 +0000 UTC m=+144.553626533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.233892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.234569 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.734550651 +0000 UTC m=+144.656363685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.335886 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.336319 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.83629999 +0000 UTC m=+144.758113014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.438639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.441878 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.941861661 +0000 UTC m=+144.863674685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.520043 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-20 18:27:30 +0000 UTC, rotation deadline is 2026-10-16 23:47:11.334024711 +0000 UTC Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.520586 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6461h14m39.813441539s for next certificate rotation Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.540657 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.541161 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.041121588 +0000 UTC m=+144.962934622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cv7zc" event={"ID":"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b","Type":"ContainerStarted","Data":"9e6a94dcd69ec5878174d7da81e29ba694ed10cd4152d26082c28ea6eac57b95"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x95ml" event={"ID":"00a9d467-1154-4eae-b1e5-19dfbb214a80","Type":"ContainerStarted","Data":"b93bb8376800a790e2bc5f26d8a6290473ff1ef1cd0d8d69a65c430c24fec5a9"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638524 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerStarted","Data":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638612 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.642538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.643099 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.143051551 +0000 UTC m=+145.064864805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.655539 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"eb06841b967a9ee52b36b5866eaa4628fa486a3492924ae7f602afbad102b81c"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bkbfc" event={"ID":"fec9cba4-b7cb-46ca-90a4-af0d5114fee8","Type":"ContainerStarted","Data":"92ed1cdedc72cff75793979290982d2a68870b42701fb4301823c687477e5622"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bkbfc" event={"ID":"fec9cba4-b7cb-46ca-90a4-af0d5114fee8","Type":"ContainerStarted","Data":"b37f5ddb904bef473f681c1f2ad91b1594e544b7305690cf7e2a0afd8ded483b"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670688 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.673770 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.673860 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.675010 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"16aa457ee6765bc06137ae2471c969825987135af2a1af7ccb6ac13745a1cc94"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.681956 4773 generic.go:334] "Generic (PLEG): container finished" podID="275484db-b3bc-4027-a1d7-a67ab3c71439" containerID="14f73794a9b75752413743e6602360dfe037565dcc05bceef502d47eecf8267e" exitCode=0 Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.682080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerDied","Data":"14f73794a9b75752413743e6602360dfe037565dcc05bceef502d47eecf8267e"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.682131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" 
event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerStarted","Data":"3c86239b4f8f8ff2b2d882db7a139f7059fdb529dde2ab6cd2d6fe31191f35b6"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.692575 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cv7zc" podStartSLOduration=6.692546791 podStartE2EDuration="6.692546791s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.662314726 +0000 UTC m=+144.584127750" watchObservedRunningTime="2026-01-20 18:32:31.692546791 +0000 UTC m=+144.614359815" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.694326 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x95ml" podStartSLOduration=125.694320135 podStartE2EDuration="2m5.694320135s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.689548137 +0000 UTC m=+144.611361161" watchObservedRunningTime="2026-01-20 18:32:31.694320135 +0000 UTC m=+144.616133159" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.701430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerStarted","Data":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.723754 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" podStartSLOduration=125.72373283 podStartE2EDuration="2m5.72373283s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.723532194 +0000 UTC m=+144.645345208" watchObservedRunningTime="2026-01-20 18:32:31.72373283 +0000 UTC m=+144.645545854" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.728743 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.750202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.751828 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.251810012 +0000 UTC m=+145.173623036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.783640 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" podStartSLOduration=125.783616725 podStartE2EDuration="2m5.783616725s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.752463068 +0000 UTC m=+144.674276092" watchObservedRunningTime="2026-01-20 18:32:31.783616725 +0000 UTC m=+144.705429749" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.815749 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9nh6h" podStartSLOduration=125.815720767 podStartE2EDuration="2m5.815720767s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.812413726 +0000 UTC m=+144.734226760" watchObservedRunningTime="2026-01-20 18:32:31.815720767 +0000 UTC m=+144.737533791" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.852224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.857803 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.357787114 +0000 UTC m=+145.279600128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.879041 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bkbfc" podStartSLOduration=125.878999726 podStartE2EDuration="2m5.878999726s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.87874857 +0000 UTC m=+144.800561614" watchObservedRunningTime="2026-01-20 18:32:31.878999726 +0000 UTC m=+144.800812760" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.896283 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.904387 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Jan 20 18:32:31 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:31 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:31 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.904452 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.959699 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.960012 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.459921481 +0000 UTC m=+145.381734515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.960375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.968412 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.46840289 +0000 UTC m=+145.390215914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.044312 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.074486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.074918 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.574899285 +0000 UTC m=+145.496712299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.090115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.115414 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.179619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.180121 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.680103718 +0000 UTC m=+145.601916742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.223486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.246177 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.258054 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.280610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.281658 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.781201581 +0000 UTC m=+145.703014615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.310187 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b41ab6_6253_4ee5_87f2_50ed05610e03.slice/crio-304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf WatchSource:0}: Error finding container 304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf: Status 404 returned error can't find the container with id 304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.385002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.389203 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.889185883 +0000 UTC m=+145.810998907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.393552 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.422044 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.486871 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.535045 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.034958146 +0000 UTC m=+145.956771180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.590942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.591572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.09155228 +0000 UTC m=+146.013365304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.627628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.637073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.692006 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.692269 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.192235242 +0000 UTC m=+146.114048266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.692812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.693664 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.193645787 +0000 UTC m=+146.115458811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.716069 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.722095 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" event={"ID":"bbce412e-616a-465b-bb42-da842edb8110","Type":"ContainerStarted","Data":"6f68abfc206ae3f46958c0e3edf9d6d16057f9c152b2c78a2d813bdd3b2789a1"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.723991 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.726509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerStarted","Data":"37b89ef034f836300d6b35b391a2842edbfa58a86d9c055d071138a9e217debd"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.726778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.733434 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.737209 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.743166 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.746703 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.747503 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.751950 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.752940 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.756643 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"e5688fa5e836d6693ad378718ac2d923e2bd4a94946bfe45ad6a19797eb22650"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.776782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerStarted","Data":"eb7554d40f82e32096a8755820f9be85d890934dacb68219ab8675407e48ee64"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.777611 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:32 crc 
kubenswrapper[4773]: I0120 18:32:32.784268 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.787116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerStarted","Data":"8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788086 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" podStartSLOduration=126.788057215 podStartE2EDuration="2m6.788057215s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.778068729 +0000 UTC m=+145.699881773" watchObservedRunningTime="2026-01-20 18:32:32.788057215 +0000 UTC m=+145.709870239" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" event={"ID":"65b41ab6-6253-4ee5-87f2-50ed05610e03","Type":"ContainerStarted","Data":"5e59468bd2c4f19972215ce899890309328c39e745f90c227579a59631ddbcbd"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" event={"ID":"65b41ab6-6253-4ee5-87f2-50ed05610e03","Type":"ContainerStarted","Data":"304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.796230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.796744 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.296721778 +0000 UTC m=+146.218534802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.801418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"be1d4d724e34c401ca901337c0c53e9b8b1dee0ad78ae08cd1264117b04a97b2"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.810219 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fd2de1_85c4_4f01_8524_7b93c777592d.slice/crio-9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f WatchSource:0}: Error finding container 9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f: Status 404 returned error can't find the container with id 9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.810773 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" event={"ID":"79162e32-ee8c-4fcc-8911-0f95d41cd110","Type":"ContainerStarted","Data":"6549422212c40d6874303175493a48597f52b1121344c7b0fe26e9f5f7c50976"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.810824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" event={"ID":"79162e32-ee8c-4fcc-8911-0f95d41cd110","Type":"ContainerStarted","Data":"bef50359537c89305c7d074b823ddef915458135e59aeb4cfeff10c7cba90d87"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.813653 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7552ac_b3a0_4bfa_ab3e_34e46ed83cff.slice/crio-80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8 WatchSource:0}: Error finding container 80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8: Status 404 returned error can't find the container with id 80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8 Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.817655 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e5ac136_d46c_45e3_9a5f_548ac22fac5c.slice/crio-d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f WatchSource:0}: Error finding container d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f: Status 404 returned error can't find the container with id d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.817907 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" 
event={"ID":"9d63bfb8-8ecc-43f3-8931-cc09c815c580","Type":"ContainerStarted","Data":"6e8ea7b775620d0f97028008ad9efc7b0d9cd7ffbc15b9ce92c518b5ca3c147a"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.818420 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" podStartSLOduration=126.818400952 podStartE2EDuration="2m6.818400952s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.801901426 +0000 UTC m=+145.723714450" watchObservedRunningTime="2026-01-20 18:32:32.818400952 +0000 UTC m=+145.740213976" Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.820871 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27e80d9_dea2_4e87_90c8_1c69288cfa55.slice/crio-7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a WatchSource:0}: Error finding container 7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a: Status 404 returned error can't find the container with id 7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.827462 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" event={"ID":"90ec3f02-fbee-4465-b262-28b2b475e2b9","Type":"ContainerStarted","Data":"886a0cae110cb0fdcc1f14e44c311e7eca71d1c6f9627d542047ac3dbd222b51"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.835455 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cee99f1_8905_4089_be36_90af1426d834.slice/crio-da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3 WatchSource:0}: Error finding container 
da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3: Status 404 returned error can't find the container with id da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3 Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.846987 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" podStartSLOduration=126.846824523 podStartE2EDuration="2m6.846824523s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.818917006 +0000 UTC m=+145.740730030" watchObservedRunningTime="2026-01-20 18:32:32.846824523 +0000 UTC m=+145.768637568" Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.847374 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f987ee_958e_41a1_8cf4_ef0da8212364.slice/crio-0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a WatchSource:0}: Error finding container 0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a: Status 404 returned error can't find the container with id 0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.847902 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f0fde2_da58_4350_ad67_cb29a2684875.slice/crio-7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0 WatchSource:0}: Error finding container 7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0: Status 404 returned error can't find the container with id 7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0 Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.848469 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" podStartSLOduration=126.848461614 podStartE2EDuration="2m6.848461614s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.845670355 +0000 UTC m=+145.767483379" watchObservedRunningTime="2026-01-20 18:32:32.848461614 +0000 UTC m=+145.770274648" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"13f6dc0568832cade7ae3139c8359ec8edc3489647d922a3ad2b60d581bea75f"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856187 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"aa10bed1ca3658e404efc58b50b8aff8c7ffe8e7a9f8da7cf9295153cbc28cfc"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"c9fc379b8739fb28bddca6604cd4b3e92eb8915c5ef4df88ed28b9d99e73f96f"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.857372 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.857416 4773 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.887820 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" podStartSLOduration=126.887798483 podStartE2EDuration="2m6.887798483s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.866055497 +0000 UTC m=+145.787868521" watchObservedRunningTime="2026-01-20 18:32:32.887798483 +0000 UTC m=+145.809611507" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.894056 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.898540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.900308 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:32 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:32 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:32 crc kubenswrapper[4773]: healthz check failed 
Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.900361 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.901637 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.902760 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.402742182 +0000 UTC m=+146.324555416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.903389 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" podStartSLOduration=125.903371187 podStartE2EDuration="2m5.903371187s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.901919521 +0000 UTC m=+145.823732545" watchObservedRunningTime="2026-01-20 18:32:32.903371187 +0000 UTC m=+145.825184201" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.923918 4773 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.926348 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d6700e_54f1_4f09_83d7_e85f66af8c85.slice/crio-6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c WatchSource:0}: Error finding container 6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c: Status 404 returned error can't find the container with id 6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.962827 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.966948 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.974880 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.977782 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.003200 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.004455 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.504428318 +0000 UTC m=+146.426241342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: W0120 18:32:33.049490 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71dd00dd_f11c_43a8_b7a2_2416a1761d94.slice/crio-5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8 WatchSource:0}: Error finding container 5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8: Status 404 returned error can't find the container with id 5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8 Jan 20 18:32:33 crc kubenswrapper[4773]: W0120 18:32:33.067416 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b703025_44fd_42d1_81fa_27ef31c9d2fb.slice/crio-0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd WatchSource:0}: Error finding container 0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd: Status 404 returned error can't find the container with id 0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.105616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.106217 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.606195907 +0000 UTC m=+146.528008931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.207251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.207875 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.70779144 +0000 UTC m=+146.629604464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.209520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.210034 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.710016306 +0000 UTC m=+146.631829330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.310188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.310591 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.810571405 +0000 UTC m=+146.732384429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.413222 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.413561 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.913546782 +0000 UTC m=+146.835359806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.516601 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.516811 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.016782377 +0000 UTC m=+146.938595391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.517390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.518997 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.018977192 +0000 UTC m=+146.940790216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.620904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.621329 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.121310704 +0000 UTC m=+147.043123728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.731327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.731715 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.231696765 +0000 UTC m=+147.153509789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.832947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.833349 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.33332875 +0000 UTC m=+147.255141774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.883762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerStarted","Data":"b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.904072 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 20 18:32:33 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld
Jan 20 18:32:33 crc kubenswrapper[4773]: [+]process-running ok
Jan 20 18:32:33 crc kubenswrapper[4773]: healthz check failed
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.904142 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.907990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" event={"ID":"19da63c0-7e43-4bb0-a8fb-590722ea7cf2","Type":"ContainerStarted","Data":"7417bf00db4e4f95d66c7d334c5b590c0e25dfeb28f418d4ac9d15eea500769d"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.908773 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" event={"ID":"19da63c0-7e43-4bb0-a8fb-590722ea7cf2","Type":"ContainerStarted","Data":"13f2bc32e5beb9b77bdf1af9461f914ea5370008a6fb39fff9aacd066ab3cb85"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.934881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.935288 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.435272893 +0000 UTC m=+147.357085917 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.951310 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"ecc3b1f32a305d38b9fa2b15e94a8361bade9cbfbe54907a33fc42929d5a22b5"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.955572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" event={"ID":"22f987ee-958e-41a1-8cf4-ef0da8212364","Type":"ContainerStarted","Data":"0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.958704 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" podStartSLOduration=127.95867562 podStartE2EDuration="2m7.95867562s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:33.911519947 +0000 UTC m=+146.833332981" watchObservedRunningTime="2026-01-20 18:32:33.95867562 +0000 UTC m=+146.880488644"
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.961470 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3570207-5cb9-4481-a15a-d0bb9312a84b" containerID="acc62163aa86828a85c83f6cde38f155ecb7db429189e69c347af4d13ac3334c" exitCode=0
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.961841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerDied","Data":"acc62163aa86828a85c83f6cde38f155ecb7db429189e69c347af4d13ac3334c"}
Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.974223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t8h7" event={"ID":"2b703025-44fd-42d1-81fa-27ef31c9d2fb","Type":"ContainerStarted","Data":"0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:33.998252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"4560d5b5bd62f44e40075e15bc1aca9675032e7a34a4262a64615248a77a4c0e"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:33.998315 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.026201 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" podStartSLOduration=127.026181294 podStartE2EDuration="2m7.026181294s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:33.961196533 +0000 UTC m=+146.883009557" watchObservedRunningTime="2026-01-20 18:32:34.026181294 +0000 UTC m=+146.947994318"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.031791 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"d3711cb68365d51583974da9debc837a3f2d76cc1d5e41a6590b6dd5f3ca9fcd"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.031841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"9d375a9b6f2d15244e273f1939e83a40f6446af9f37f4c05cede485027c4a22a"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.035735 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.038415 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.538391975 +0000 UTC m=+147.460204999 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.060650 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.081310 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" podStartSLOduration=128.081291242 podStartE2EDuration="2m8.081291242s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.08118353 +0000 UTC m=+147.002996554" watchObservedRunningTime="2026-01-20 18:32:34.081291242 +0000 UTC m=+147.003104266"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.111839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"fbbc916eabc58535725d8bb371bf1acc9123d4e5db6dbab70a5320c438b7a02e"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.140403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.140756 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.640740658 +0000 UTC m=+147.562553692 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.161303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"3d689f35c459df7616e7c28634c131829324f5334f5fd88df6ba44ccf51bd89b"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.161367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"f2e93ed028afe2ff83347e9ba0da520844ee9c4df8c7f334a9a10ce40fff9779"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.205819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" event={"ID":"14f80aee-1c1f-4beb-a280-3ac021e920c9","Type":"ContainerStarted","Data":"6a02e35c63a3c3855177ab1c0f0237b22cd314d673e1879ed7f5bdc7a6515d44"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.229331 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" event={"ID":"1e5ac136-d46c-45e3-9a5f-548ac22fac5c","Type":"ContainerStarted","Data":"c901060e5d78a348e16c6ce46f203d3c42ba5fe26d5799a1170a9b4307b0a0a5"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.229390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" event={"ID":"1e5ac136-d46c-45e3-9a5f-548ac22fac5c","Type":"ContainerStarted","Data":"d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.241324 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.241577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.242179 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.742139717 +0000 UTC m=+147.663952741 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.252846 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" podStartSLOduration=127.252826751 podStartE2EDuration="2m7.252826751s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.252402221 +0000 UTC m=+147.174215245" watchObservedRunningTime="2026-01-20 18:32:34.252826751 +0000 UTC m=+147.174639775"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.262987 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.265734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerStarted","Data":"f4d91eb42c30324decc0123b0752b77625e1bfc343e356223cf0e111b47451d8"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.339254 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" event={"ID":"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff","Type":"ContainerStarted","Data":"80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347160 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.350590 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.8505688 +0000 UTC m=+147.772381824 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.351383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.363165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.380724 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" podStartSLOduration=128.380705213 podStartE2EDuration="2m8.380705213s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.380478757 +0000 UTC m=+147.302291791" watchObservedRunningTime="2026-01-20 18:32:34.380705213 +0000 UTC m=+147.302518237"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.386269 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.399346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"013f38fc18d5f7cfc5471d629627ae4858db35adfd2628717ed1fbec88115930"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.399397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.425989 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" podStartSLOduration=127.425967248 podStartE2EDuration="2m7.425967248s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.424544563 +0000 UTC m=+147.346357587" watchObservedRunningTime="2026-01-20 18:32:34.425967248 +0000 UTC m=+147.347780272"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.435252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" event={"ID":"90ec3f02-fbee-4465-b262-28b2b475e2b9","Type":"ContainerStarted","Data":"ec69934220ed5dec49af1bb5081abe0d905cbf101311d24364907de23320b3ec"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.437625 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.440950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" event={"ID":"9d63bfb8-8ecc-43f3-8931-cc09c815c580","Type":"ContainerStarted","Data":"5c46838bf621378464901a678b09adcd44bc559135c828da641d2d0f7915bd1f"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.450538 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.455551 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.955529378 +0000 UTC m=+147.877342402 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455460 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.457663 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.957648509 +0000 UTC m=+147.879461533 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.458772 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.473052 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" podStartSLOduration=127.473027518 podStartE2EDuration="2m7.473027518s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.468731353 +0000 UTC m=+147.390544397" watchObservedRunningTime="2026-01-20 18:32:34.473027518 +0000 UTC m=+147.394840532"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.474873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" event={"ID":"bbce412e-616a-465b-bb42-da842edb8110","Type":"ContainerStarted","Data":"9e9fb02fd7d190cfc9b3fc46ab9c3917f121fef946756245ad1e1fd86d322de3"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.514646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" event={"ID":"71dd00dd-f11c-43a8-b7a2-2416a1761d94","Type":"ContainerStarted","Data":"5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.554428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"d5665263fb63f8073fee27a66d95411966848e896fb3af208c72940be2d28d3d"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.554855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.558679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.559082 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.059043569 +0000 UTC m=+147.980856583 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.559395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.563828 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.063819067 +0000 UTC m=+147.985632091 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.590466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" event={"ID":"63fd2de1-85c4-4f01-8524-7b93c777592d","Type":"ContainerStarted","Data":"9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.596228 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" event={"ID":"027ba59d-f4ba-430f-af60-a7f293dd2052","Type":"ContainerStarted","Data":"ffbfbf1e74220cef3503469c917219539f335187a85eab929928a5894d7d61b9"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.612781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" podStartSLOduration=128.612760563 podStartE2EDuration="2m8.612760563s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.55826452 +0000 UTC m=+147.480077544" watchObservedRunningTime="2026-01-20 18:32:34.612760563 +0000 UTC m=+147.534573587"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.639633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" event={"ID":"12a1e676-da4c-46d2-a8f6-11dedde983fc","Type":"ContainerStarted","Data":"6fb49a54bbe3a1f2afde080ae3bccd7e3d22704d369a1b735706e91e7a522ef8"}
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.651968 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.657877 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" podStartSLOduration=127.657841304 podStartE2EDuration="2m7.657841304s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.61709099 +0000 UTC m=+147.538904014" watchObservedRunningTime="2026-01-20 18:32:34.657841304 +0000 UTC m=+147.579654328"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.659349 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" podStartSLOduration=127.659343561 podStartE2EDuration="2m7.659343561s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.658764087 +0000 UTC m=+147.580577111" watchObservedRunningTime="2026-01-20 18:32:34.659343561 +0000 UTC m=+147.581156585"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.660722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.662101 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.162086859 +0000 UTC m=+148.083899883 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.686215 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.722694 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" podStartSLOduration=128.722652831 podStartE2EDuration="2m8.722652831s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.719730579 +0000 UTC m=+147.641543613" watchObservedRunningTime="2026-01-20 18:32:34.722652831 +0000 UTC m=+147.644465865"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.766977 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.769052 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.269038826 +0000 UTC m=+148.190851850 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.786988 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" podStartSLOduration=128.786969487 podStartE2EDuration="2m8.786969487s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.782609059 +0000 UTC m=+147.704422083" watchObservedRunningTime="2026-01-20 18:32:34.786969487 +0000 UTC m=+147.708782511"
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.877550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.878196 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.378163285 +0000 UTC m=+148.299976309 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.878290 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.878990 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.378970645 +0000 UTC m=+148.300783669 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.905654 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:34 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:34 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:34 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.905720 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.980680 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.980986 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:32:35.480949759 +0000 UTC m=+148.402762783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.981475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.981921 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.481905923 +0000 UTC m=+148.403718947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.088462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.088974 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.588954921 +0000 UTC m=+148.510767945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: W0120 18:32:35.174318 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a WatchSource:0}: Error finding container 6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a: Status 404 returned error can't find the container with id 6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.191731 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.192101 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.692085003 +0000 UTC m=+148.613898027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.292523 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.292813 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.792795816 +0000 UTC m=+148.714608840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.397638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.398143 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.898123412 +0000 UTC m=+148.819936436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.501291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.501979 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.001960081 +0000 UTC m=+148.923773105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.603174 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.603572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.103558216 +0000 UTC m=+149.025371240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.656886 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"1bb487111a8dd96d9bb7d6357c49ebcd2c07dc95ed7a149d32cf268ec426f1f7"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.659880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerStarted","Data":"cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.660294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.662237 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ff9dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.662347 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.664689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" event={"ID":"22f987ee-958e-41a1-8cf4-ef0da8212364","Type":"ContainerStarted","Data":"3a4e2e7547bbac8361b30789e0bf6f41f905e9306b78f4aeb9f697d02916a7d3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.666363 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.681499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" event={"ID":"63fd2de1-85c4-4f01-8524-7b93c777592d","Type":"ContainerStarted","Data":"a58107d7046025e68c36fd3ac794e389787ab76dd3b2ba2a71da22a733fd9014"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.705618 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podStartSLOduration=128.705594 podStartE2EDuration="2m8.705594s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.705051937 +0000 UTC m=+148.626864951" watchObservedRunningTime="2026-01-20 18:32:35.705594 +0000 UTC m=+148.627407024" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.705683 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 
20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.706225 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.206186946 +0000 UTC m=+149.127999970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.706356 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.707123 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.207114789 +0000 UTC m=+149.128927813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.707680 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" event={"ID":"027ba59d-f4ba-430f-af60-a7f293dd2052","Type":"ContainerStarted","Data":"6be2580705d35127eefa7d98a68f1ae97b2c5c98239f0b1de2031188d3ede215"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.710213 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.716060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" event={"ID":"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff","Type":"ContainerStarted","Data":"235167a6baf46817b565ac809735811236307a81b400f55729f65b41a34149aa"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.726782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8527837d56e2e0feade0b967500c559dd5efbf8f453f98252ae068da6dc713b6"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.727081 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.731893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"e9f8a0c11637d7c7a68f1584b29bc6f9f43bbba40be0cb8895d09defdf1debc3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.732601 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" podStartSLOduration=128.732586187 podStartE2EDuration="2m8.732586187s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.729992712 +0000 UTC m=+148.651805746" watchObservedRunningTime="2026-01-20 18:32:35.732586187 +0000 UTC m=+148.654399211" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.748678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"3e690d88cdf9ecc5c7b10d1726ebdfbdf7a54c4b85cd9d25d0a921954ec5b160"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.757337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" event={"ID":"71dd00dd-f11c-43a8-b7a2-2416a1761d94","Type":"ContainerStarted","Data":"8a7e5f6ba35040f67099c393796e19b5f158800e1430665147c67fc1841f31d7"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.760990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" 
event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerStarted","Data":"284a40c07863c1fc0a978e29e9a355f06ba6da0e33de4a144c16fdda2dfcd5e6"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.764666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t8h7" event={"ID":"2b703025-44fd-42d1-81fa-27ef31c9d2fb","Type":"ContainerStarted","Data":"544ce484c16c23ac90d0d3e84093e1bddda3103b377fea1064e14541d2070f47"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.774393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" event={"ID":"14f80aee-1c1f-4beb-a280-3ac021e920c9","Type":"ContainerStarted","Data":"88c0c11d7a3e0537e2ce0b4862fb7565338b9de4fca992b73522f859e5ea0566"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.787108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798199 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"0933c284d7c611f700bc01fc4a21c3a53a009d967aeaa6152bff05e4ff861b09"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798629 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"4b88e3a17718a3484046c7281f9f6b73a5c0f1414fc06a97c38605c5dc305f8f"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.807318 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.808366 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.308328113 +0000 UTC m=+149.230141137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.825557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"2e0601da8fcd6be67b257f2e252e16ebc95579345b6166da1bcbacd948648b69"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.826585 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.839066 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5t8h7" podStartSLOduration=10.839048091 podStartE2EDuration="10.839048091s" 
podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.805788181 +0000 UTC m=+148.727601205" watchObservedRunningTime="2026-01-20 18:32:35.839048091 +0000 UTC m=+148.760861115"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.841063 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" podStartSLOduration=128.84105722 podStartE2EDuration="2m8.84105722s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.838271251 +0000 UTC m=+148.760084285" watchObservedRunningTime="2026-01-20 18:32:35.84105722 +0000 UTC m=+148.762870244"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.848311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f8cfc0915963aab530743e7ca948d4ce12478ca7b058865f0c1c7d7cfde75c13"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.848378 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f51e9c50643f110f0fb0190747fe3a71b30ac2564baedd3a85a4912b1ba90dcc"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.871466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"73acc141bfeded81d28a19fa832921c535eaa5b4ccb63932fa12854fe299f8fe"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.872217 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7j8nw"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.880288 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" podStartSLOduration=128.880271396 podStartE2EDuration="2m8.880271396s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.879503958 +0000 UTC m=+148.801317002" watchObservedRunningTime="2026-01-20 18:32:35.880271396 +0000 UTC m=+148.802084420"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.887697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" event={"ID":"12a1e676-da4c-46d2-a8f6-11dedde983fc","Type":"ContainerStarted","Data":"ecaf9f598aa7773a51c68cc4550f1673d164814d391dfbca7c764660ba07de4d"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.910583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.911909 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 20 18:32:35 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld
Jan 20 18:32:35 crc kubenswrapper[4773]: [+]process-running ok
Jan 20 18:32:35 crc kubenswrapper[4773]: healthz check failed
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.911985 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.912074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3799c786fbbb3d35b3d20344ab7186c953095791ebe703fac2ed60f73ba2d0d2"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.912987 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.913705 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.41369058 +0000 UTC m=+149.335503604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.915993 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" podStartSLOduration=128.915975407 podStartE2EDuration="2m8.915975407s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.913046485 +0000 UTC m=+148.834859509" watchObservedRunningTime="2026-01-20 18:32:35.915975407 +0000 UTC m=+148.837788431"
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.939697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"ea57fe328031a1378494cf1db170040b0ff2e64e906493eda1cfe756eb2d3ac3"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.939741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"0e9ac154fb60cce9dec38ed497643a99c7f37c7e3a84d4ae23f36c220acc19cf"}
Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.982677 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" podStartSLOduration=128.98265652 podStartE2EDuration="2m8.98265652s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.96035075 +0000 UTC m=+148.882163774" watchObservedRunningTime="2026-01-20 18:32:35.98265652 +0000 UTC m=+148.904469544"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.014661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.017498 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.516964096 +0000 UTC m=+149.438777310 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.127711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.128072 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.628058204 +0000 UTC m=+149.549871228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.131549 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7j8nw" podStartSLOduration=11.13151253 podStartE2EDuration="11.13151253s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.112803798 +0000 UTC m=+149.034616822" watchObservedRunningTime="2026-01-20 18:32:36.13151253 +0000 UTC m=+149.053325554"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.133677 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" podStartSLOduration=130.133661612 podStartE2EDuration="2m10.133661612s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.087068364 +0000 UTC m=+149.008881388" watchObservedRunningTime="2026-01-20 18:32:36.133661612 +0000 UTC m=+149.055474646"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.143628 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" podStartSLOduration=129.143600727 podStartE2EDuration="2m9.143600727s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.141020264 +0000 UTC m=+149.062833288" watchObservedRunningTime="2026-01-20 18:32:36.143600727 +0000 UTC m=+149.065413751"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.228584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.228874 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.728838758 +0000 UTC m=+149.650651782 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.228985 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.229493 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.729482065 +0000 UTC m=+149.651295089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.235398 4773 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.330688 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.330888 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.830850043 +0000 UTC m=+149.752663067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.331073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.331419 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.831411397 +0000 UTC m=+149.753224421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.432076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.432213 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.932191351 +0000 UTC m=+149.854004375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.433236 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.433719 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.933697938 +0000 UTC m=+149.855510962 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.535173 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.535387 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.035352134 +0000 UTC m=+149.957165158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.535448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.535750 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.035742253 +0000 UTC m=+149.957555277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.636834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.637339 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.137315247 +0000 UTC m=+150.059128271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.721378 4773 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-20T18:32:36.235420271Z","Handler":null,"Name":""}
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.724710 4773 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.724780 4773 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.738908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.743750 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.743834 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.761834 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2qdpl"]
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.766669 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.776209 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.790810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.805307 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"]
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841062 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841523 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.899732 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 20 18:32:36 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld
Jan 20 18:32:36 crc kubenswrapper[4773]: [+]process-running ok
Jan 20 18:32:36 crc kubenswrapper[4773]: healthz check failed
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.899800 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.936769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.943392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.952638 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v75d6"]
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.953906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.955333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.971800 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.980184 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"]
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.988067 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.006198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"77e12e08989d9253b8a6267df61c0f6a5a727483d295c1cf467ac4bbadb22f02"}
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.006278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"080c05ea441ebe9478b12dd279af6b3d5baaaf1a12a7630d123a1f146fd52f26"}
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.007787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3264e96f8657eb59542a703556e4f7626c7a9b9d060f39da3fe958c6025cc4d"}
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.013073 4773 generic.go:334] "Generic (PLEG): container finished" podID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerID="b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5" exitCode=0
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.013424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerDied","Data":"b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5"}
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.014420 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ff9dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.014474 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.017817 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044753 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044869 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.063782 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.081627 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" podStartSLOduration=12.081603298 podStartE2EDuration="12.081603298s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:37.05729286 +0000 UTC m=+149.979105904" watchObservedRunningTime="2026-01-20 18:32:37.081603298 +0000 UTC m=+150.003416322"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.128534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.141559 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.141599 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.149079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.152181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.153061 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6lqws"]
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.155035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7zslm"
Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.155144 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.167313 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.181060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248846 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.312369 4773 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.348603 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350594 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.351374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.351492 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.354453 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.369193 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.373412 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.381323 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " 
pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.483642 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.532654 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558839 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: 
\"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.559406 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.559617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.571156 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.577022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.748342 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.807273 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:32:37 crc kubenswrapper[4773]: W0120 18:32:37.820754 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6759422_151d_4228_b7c7_848c3008fb52.slice/crio-c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51 WatchSource:0}: Error finding container c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51: Status 404 returned error can't find the container with id c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51 Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.838096 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.902639 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:37 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:37 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:37 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.902712 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.002429 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:38 crc 
kubenswrapper[4773]: I0120 18:32:38.024862 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.024968 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.025002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerStarted","Data":"f57839f90df36cf23471ecda170b0c2440316e257ee6cca520a35c728d5b16de"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.026685 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028135 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028180 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 
18:32:38.033800 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"a9898240f848cc033229fced45abd6a80768873178710d5a8fc7ce6b46376824"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038171 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerStarted","Data":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038239 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerStarted","Data":"4717aabd05ca8421c098accb226b89152753529be1fa867b484287b5c5a81ae7"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038469 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.043341 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.044885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.044910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" 
event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"47261c669f243247e3360eb031003a9925a21eac9889414fbe72f5ed85389a71"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.056067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.082513 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" podStartSLOduration=132.08249441 podStartE2EDuration="2m12.08249441s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:38.07920919 +0000 UTC m=+151.001022214" watchObservedRunningTime="2026-01-20 18:32:38.08249441 +0000 UTC m=+151.004307434" Jan 20 18:32:38 crc kubenswrapper[4773]: W0120 18:32:38.105426 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e32f25_29eb_4ef0_892b_0da316c47e3d.slice/crio-6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839 WatchSource:0}: Error finding container 6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839: Status 404 returned error can't find the container with id 6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.266787 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.267167 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bkbfc" 
podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.270682 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.270712 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.442345 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.579941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.583808 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.583924 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf" (OuterVolumeSpecName: "kube-api-access-zxslf") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "kube-api-access-zxslf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679610 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679655 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679674 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.901045 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:38 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:38 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:38 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.901107 4773 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.941580 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:38 crc kubenswrapper[4773]: E0120 18:32:38.942026 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.942040 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.942197 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.943138 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.946103 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.957567 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050767 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257" exitCode=0 Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050925 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerStarted","Data":"6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerDied","Data":"8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053507 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053514 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.186969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.187391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.187673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.188032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.188027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.213761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.259954 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.348835 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.349960 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.369318 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.433196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.434122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.437150 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.437833 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.438143 4773 patch_prober.go:28] interesting pod/console-f9d7485db-9nh6h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.438219 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9nh6h" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": 
dial tcp 10.217.0.19:8443: connect: connection refused" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.458581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493333 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595681 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595716 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.596285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.596313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.649983 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.713202 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.747597 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:39 crc kubenswrapper[4773]: W0120 18:32:39.796581 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e2e310_70dc_4eb9_94f3_2e4466e2b7d2.slice/crio-259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627 WatchSource:0}: Error finding container 259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627: Status 404 returned error can't find the container with id 259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627 Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.895501 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.901420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.950609 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.951606 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.954863 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.960922 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.963788 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017508 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017656 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.076148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerStarted","Data":"259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627"} Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.082882 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.087455 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.105148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119600 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 
18:32:40.120607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.120614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.152275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: W0120 18:32:40.198081 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86872811_c0ef_45cc_949a_f88b07fca9b3.slice/crio-a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c WatchSource:0}: Error finding container a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c: Status 404 returned error can't find the container with id a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.296290 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.363522 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.364619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.412890 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.445434 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.446067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.446091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.471702 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.472583 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.475133 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.481570 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.518254 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554134 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.556359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.556373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.577298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " 
pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.658113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.658206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.659484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.677612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.698914 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.834816 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.888304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:40 crc kubenswrapper[4773]: W0120 18:32:40.898605 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fd624a_2fa6_4887_83e0_779057846c71.slice/crio-d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333 WatchSource:0}: Error finding container d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333: Status 404 returned error can't find the container with id d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.094942 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" exitCode=0 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.095223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.095541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerStarted","Data":"a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.101769 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca" exitCode=0 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 
18:32:41.101840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.105795 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.176680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.444292 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122258 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" exitCode=0 Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122376 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"c69c9f7c333b88336518be53c3fff5b901d87e988f1ab9a73e541d27f61cbb78"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.163504 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" exitCode=0 Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.163647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.179690 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerStarted","Data":"5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23"} Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.193260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerStarted","Data":"1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a"} Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.207771 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.207735475 podStartE2EDuration="3.207735475s" podCreationTimestamp="2026-01-20 18:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:43.204465154 +0000 UTC m=+156.126278178" watchObservedRunningTime="2026-01-20 18:32:43.207735475 +0000 UTC m=+156.129548499" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.434293 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.435338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.438129 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.438374 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.439009 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.558955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.559015 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.680837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.779464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.126771 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.212870 4773 generic.go:334] "Generic (PLEG): container finished" podID="f2c577af-d1a7-40f0-99be-340543255117" containerID="1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a" exitCode=0 Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.212945 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerDied","Data":"1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.135580 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.250455 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerStarted","Data":"e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.250506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerStarted","Data":"67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.554552 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.724059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"f2c577af-d1a7-40f0-99be-340543255117\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725155 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"f2c577af-d1a7-40f0-99be-340543255117\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725266 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f2c577af-d1a7-40f0-99be-340543255117" (UID: "f2c577af-d1a7-40f0-99be-340543255117"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725786 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.738735 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f2c577af-d1a7-40f0-99be-340543255117" (UID: "f2c577af-d1a7-40f0-99be-340543255117"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.832223 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.281047 4773 generic.go:334] "Generic (PLEG): container finished" podID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerID="e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644" exitCode=0 Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.281132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerDied","Data":"e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644"} Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289053 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerDied","Data":"5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23"} Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289198 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.280162 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.915704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.937973 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.064914 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.437340 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.441758 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.317979 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.318725 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" containerID="cri-o://0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" gracePeriod=30 Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.345320 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.345628 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" containerID="cri-o://bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" gracePeriod=30 Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.368147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerDied","Data":"67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46"} Jan 20 18:32:54 crc kubenswrapper[4773]: 
I0120 18:32:54.368203 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.396518 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497354 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" (UID: "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497582 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.375191 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerID="bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" exitCode=0 Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.375280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerDied","Data":"bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade"} Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376876 4773 generic.go:334] "Generic (PLEG): container finished" podID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerID="0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" exitCode=0 Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerDied","Data":"0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122"} Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376965 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.861511 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.861913 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.919904 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.920000 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 20 18:32:57 crc kubenswrapper[4773]: I0120 18:32:57.025875 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.171081 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.171360 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.899699 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" (UID: "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.966815 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.322750 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3794576891/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.323343 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dppcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-w4bcd_openshift-marketplace(c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3794576891/2\": happened during read: context canceled" logger="UnhandledError" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.325088 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing 
blob: storing blob to file \\\"/var/tmp/container_images_storage3794576891/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" Jan 20 18:33:04 crc kubenswrapper[4773]: E0120 18:33:04.833228 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.447879 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.448569 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbj8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v75d6_openshift-marketplace(074f367d-7a48-4046-a679-9a2d38111b8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.449802 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" Jan 20 18:33:07 crc 
kubenswrapper[4773]: I0120 18:33:07.862118 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.862727 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.920583 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.920739 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.030737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.111564 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.139502 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.139970 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.139989 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.140010 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140018 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.140025 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140032 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140256 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc 
kubenswrapper[4773]: I0120 18:33:10.140276 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140286 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.142833 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.154486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.191259 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod 
\"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226581 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.227971 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.228250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config" (OuterVolumeSpecName: "config") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.235259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.240071 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f" (OuterVolumeSpecName: "kube-api-access-jz48f") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "kube-api-access-jz48f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.306957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331709 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331759 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331778 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331833 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332105 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332174 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332193 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332238 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332249 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332259 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332268 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.333334 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca" (OuterVolumeSpecName: "client-ca") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.334768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config" (OuterVolumeSpecName: "config") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.335047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.341256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8" (OuterVolumeSpecName: "kube-api-access-l2mg8") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "kube-api-access-l2mg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.341448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.389339 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.434892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.434984 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc 
kubenswrapper[4773]: I0120 18:33:10.435093 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435105 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435116 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435129 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435143 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.436451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.436583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod 
\"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.439256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.457966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471810 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerDied","Data":"0be8a6611d182b19e3c280dba8bf32b32d7fa146a5cc6f6279d1419ba167e1bf"} Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471917 4773 scope.go:117] "RemoveContainer" containerID="0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.473838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerDied","Data":"3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632"} Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.473968 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.502107 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.505614 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.505811 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.515316 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.517754 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:33:11 crc kubenswrapper[4773]: I0120 18:33:11.461036 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" path="/var/lib/kubelet/pods/1d49ef4e-91fb-4b98-89d9-65358c718967/volumes" Jan 20 18:33:11 crc kubenswrapper[4773]: I0120 18:33:11.461830 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" path="/var/lib/kubelet/pods/419120bb-3f1b-4f21-adf5-ac057bd5dce6/volumes" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.492392 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"bb794a0476596059002b2dbcd77c1dfb954fb595fb1c1ebf0af0fd989e74c7f0"} Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.806741 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:13 crc kubenswrapper[4773]: E0120 18:33:13.807270 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807284 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807372 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807759 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.810786 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811164 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811337 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811493 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811631 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811800 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.830787 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.836980 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.887476 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927414 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " 
pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.028988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029136 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94np\" (UniqueName: 
\"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.032630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.034261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.039666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.039690 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.045321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.135983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.691199 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.835121 4773 scope.go:117] "RemoveContainer" containerID="bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.903066 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.903498 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9wnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wmbvt_openshift-marketplace(86872811-c0ef-45cc-949a-f88b07fca9b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.904651 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" Jan 20 18:33:15 crc 
kubenswrapper[4773]: I0120 18:33:15.086453 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.186421 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.505778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.508302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.510347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.513438 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerStarted","Data":"90446c16d64bd4019cc2e68d6478f0e637b6dc8de10aa866b35345c2046c26a3"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.520070 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2" exitCode=0 Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.520152 
4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.524959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"3eca4f59eed7f9fc182e2dae90196776347430effa57714afcf7767e9e086b4a"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerStarted","Data":"12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerStarted","Data":"33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534963 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" containerID="cri-o://12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" gracePeriod=30 Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.535456 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.546388 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} Jan 20 18:33:15 crc kubenswrapper[4773]: E0120 18:33:15.549905 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.643533 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podStartSLOduration=21.643511453 podStartE2EDuration="21.643511453s" podCreationTimestamp="2026-01-20 18:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:15.64300614 +0000 UTC m=+188.564819164" watchObservedRunningTime="2026-01-20 18:33:15.643511453 +0000 UTC m=+188.565324477" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.728487 4773 patch_prober.go:28] interesting pod/route-controller-manager-588997d685-k4wmn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:50614->10.217.0.54:8443: read: connection reset by peer" start-of-body= Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.731150 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": 
read tcp 10.217.0.2:50614->10.217.0.54:8443: read: connection reset by peer" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.554430 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.555048 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557174 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588997d685-k4wmn_7efe70f9-78a3-4abf-b920-03868c3f9041/route-controller-manager/0.log" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557225 4773 generic.go:334] "Generic (PLEG): container finished" podID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerID="12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" exitCode=255 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerDied","Data":"12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557398 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerDied","Data":"33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557410 4773 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.560766 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.560887 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.564367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"038e90b2feb4b20bafc6c25c832f9bbbc4b1cda068561d44d6a6b1c77e3b7d7f"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.571347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerStarted","Data":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.571630 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.576004 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.576099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" 
event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579269 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579726 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.593908 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588997d685-k4wmn_7efe70f9-78a3-4abf-b920-03868c3f9041/route-controller-manager/0.log" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.594011 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.609610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podStartSLOduration=3.609577256 podStartE2EDuration="3.609577256s" podCreationTimestamp="2026-01-20 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:16.605167357 +0000 UTC m=+189.526980381" watchObservedRunningTime="2026-01-20 18:33:16.609577256 +0000 UTC m=+189.531390280" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.624756 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4jpbd" podStartSLOduration=170.624738409 podStartE2EDuration="2m50.624738409s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:16.620750842 +0000 UTC m=+189.542563866" watchObservedRunningTime="2026-01-20 18:33:16.624738409 +0000 UTC m=+189.546551433" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.708675 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:16 crc kubenswrapper[4773]: E0120 18:33:16.708900 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.708916 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.709037 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.709397 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788555 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.789421 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.789761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config" (OuterVolumeSpecName: "config") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890241 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: 
I0120 18:33:16.890430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890713 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890743 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992006 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992059 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992127 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992198 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.993283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.994279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.008077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 
18:33:17.676981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.679345 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.680001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp" (OuterVolumeSpecName: "kube-api-access-8bfqp") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "kube-api-access-8bfqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.698169 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.748363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.752767 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.780591 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.789905 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.822338 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.827495 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.929813 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.025142 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.027276 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.041419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.043236 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.048144 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.085675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.085805 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187180 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187301 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.237682 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.366211 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.612086 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: W0120 18:33:18.622180 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e6b5555_e3e7_43c5_8fdc_090dcfda09bc.slice/crio-296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510 WatchSource:0}: Error finding container 296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510: Status 404 returned error can't find the container with id 296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510 Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.701277 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:18 crc kubenswrapper[4773]: W0120 18:33:18.709913 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4033f4d4_e3aa_40fd_b5f3_558833a6846d.slice/crio-90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296 WatchSource:0}: Error finding container 90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296: Status 404 returned error can't find the container with id 90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296 Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.711596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerStarted","Data":"296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.455445 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" 
path="/var/lib/kubelet/pods/7efe70f9-78a3-4abf-b920-03868c3f9041/volumes" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.721783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerStarted","Data":"17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.725277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerStarted","Data":"6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerStarted","Data":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerStarted","Data":"90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727691 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.741946 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8jjn" podStartSLOduration=3.620695945 podStartE2EDuration="42.741897325s" podCreationTimestamp="2026-01-20 18:32:37 +0000 UTC" firstStartedPulling="2026-01-20 
18:32:39.052839029 +0000 UTC m=+151.974652053" lastFinishedPulling="2026-01-20 18:33:18.174040409 +0000 UTC m=+191.095853433" observedRunningTime="2026-01-20 18:33:19.741021853 +0000 UTC m=+192.662834877" watchObservedRunningTime="2026-01-20 18:33:19.741897325 +0000 UTC m=+192.663710349" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.762278 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" podStartSLOduration=6.762257446 podStartE2EDuration="6.762257446s" podCreationTimestamp="2026-01-20 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:19.760914793 +0000 UTC m=+192.682727817" watchObservedRunningTime="2026-01-20 18:33:19.762257446 +0000 UTC m=+192.684070470" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.778323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.778305242 podStartE2EDuration="1.778305242s" podCreationTimestamp="2026-01-20 18:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:19.776288412 +0000 UTC m=+192.698101436" watchObservedRunningTime="2026-01-20 18:33:19.778305242 +0000 UTC m=+192.700118256" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.884997 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:20 crc kubenswrapper[4773]: I0120 18:33:20.734820 4773 generic.go:334] "Generic (PLEG): container finished" podID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerID="6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17" exitCode=0 Jan 20 18:33:20 crc kubenswrapper[4773]: I0120 
18:33:20.734959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerDied","Data":"6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17"} Jan 20 18:33:21 crc kubenswrapper[4773]: I0120 18:33:21.742502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d"} Jan 20 18:33:21 crc kubenswrapper[4773]: I0120 18:33:21.765640 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2qdpl" podStartSLOduration=3.336568662 podStartE2EDuration="45.765613998s" podCreationTimestamp="2026-01-20 18:32:36 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.048052181 +0000 UTC m=+150.969865205" lastFinishedPulling="2026-01-20 18:33:20.477097517 +0000 UTC m=+193.398910541" observedRunningTime="2026-01-20 18:33:21.762625895 +0000 UTC m=+194.684438919" watchObservedRunningTime="2026-01-20 18:33:21.765613998 +0000 UTC m=+194.687427022" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.507380 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.694793 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.695085 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" (UID: "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.695800 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.696075 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.699517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" (UID: "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753630 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerDied","Data":"296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510"} Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753674 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753684 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.797272 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.639776 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 18:33:26 crc kubenswrapper[4773]: E0120 18:33:26.640548 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.640571 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.640774 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.641487 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.645088 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.645415 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.646305 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737016 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.860806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " 
pod="openshift-kube-apiserver/installer-9-crc"
Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.976602 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.129212 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.129491 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.223009 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.752352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.752979 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.820452 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2qdpl"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.823768 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.882513 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170136 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170259 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170403 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.171296 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.171566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" gracePeriod=600
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.453087 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"]
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.604469 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 20 18:33:28 crc kubenswrapper[4773]: W0120 18:33:28.614012 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod29920243_d87d_49b3_9215_680935300c6e.slice/crio-425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917 WatchSource:0}: Error finding container 425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917: Status 404 returned error can't find the container with id 425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917
Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.781330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerStarted","Data":"425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917"}
Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.788846 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" exitCode=0
Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.788914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"}
Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.790662 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8jjn" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server" containerID="cri-o://17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560" gracePeriod=2
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.797578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.799876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.803337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.804803 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerStarted","Data":"87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.806783 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560" exitCode=0
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.806838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.808544 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91" exitCode=0
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.808575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91"}
Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.821306 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fm4ln" podStartSLOduration=9.12255177 podStartE2EDuration="51.821259024s" podCreationTimestamp="2026-01-20 18:32:39 +0000 UTC" firstStartedPulling="2026-01-20 18:32:42.175419128 +0000 UTC m=+155.097232152" lastFinishedPulling="2026-01-20 18:33:24.874126392 +0000 UTC m=+197.795939406" observedRunningTime="2026-01-20 18:33:30.817488887 +0000 UTC m=+203.739301911" watchObservedRunningTime="2026-01-20 18:33:30.821259024 +0000 UTC m=+203.743072048"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.384680 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.404543 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6lqws" podStartSLOduration=4.3001280170000005 podStartE2EDuration="54.404522848s" podCreationTimestamp="2026-01-20 18:32:37 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.030511719 +0000 UTC m=+150.952324733" lastFinishedPulling="2026-01-20 18:33:28.13490651 +0000 UTC m=+201.056719564" observedRunningTime="2026-01-20 18:33:30.839746086 +0000 UTC m=+203.761559110" watchObservedRunningTime="2026-01-20 18:33:31.404522848 +0000 UTC m=+204.326335872"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.524867 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") "
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.524939 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") "
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.524984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") "
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.549473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities" (OuterVolumeSpecName: "utilities") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.560205 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw" (OuterVolumeSpecName: "kube-api-access-gt4kw") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "kube-api-access-gt4kw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.592698 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626201 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626232 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626242 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.817266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"}
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.819221 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d" exitCode=0
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.819276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d"}
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839"}
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822032 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822088 4773 scope.go:117] "RemoveContainer" containerID="17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.824392 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" exitCode=0
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.825085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8"}
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.849493 4773 scope.go:117] "RemoveContainer" containerID="7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.897165 4773 scope.go:117] "RemoveContainer" containerID="40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257"
Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.972146 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.972128104 podStartE2EDuration="5.972128104s" podCreationTimestamp="2026-01-20 18:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:31.96846414 +0000 UTC m=+204.890277164" watchObservedRunningTime="2026-01-20 18:33:31.972128104 +0000 UTC m=+204.893941128"
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.025590 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxsfk" podStartSLOduration=6.435064255 podStartE2EDuration="52.025570741s" podCreationTimestamp="2026-01-20 18:32:40 +0000 UTC" firstStartedPulling="2026-01-20 18:32:42.149969251 +0000 UTC m=+155.071782275" lastFinishedPulling="2026-01-20 18:33:27.740475697 +0000 UTC m=+200.662288761" observedRunningTime="2026-01-20 18:33:32.022140223 +0000 UTC m=+204.943953247" watchObservedRunningTime="2026-01-20 18:33:32.025570741 +0000 UTC m=+204.947383765"
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.050971 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"]
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.053576 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"]
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.831179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerStarted","Data":"f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07"}
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.835901 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerStarted","Data":"20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41"}
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.839142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerStarted","Data":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"}
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.856229 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4bcd" podStartSLOduration=3.749818164 podStartE2EDuration="54.856212287s" podCreationTimestamp="2026-01-20 18:32:38 +0000 UTC" firstStartedPulling="2026-01-20 18:32:41.147165033 +0000 UTC m=+154.068978057" lastFinishedPulling="2026-01-20 18:33:32.253559156 +0000 UTC m=+205.175372180" observedRunningTime="2026-01-20 18:33:32.854806821 +0000 UTC m=+205.776619845" watchObservedRunningTime="2026-01-20 18:33:32.856212287 +0000 UTC m=+205.778025311"
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.875318 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wmbvt" podStartSLOduration=2.684900787 podStartE2EDuration="53.875298536s" podCreationTimestamp="2026-01-20 18:32:39 +0000 UTC" firstStartedPulling="2026-01-20 18:32:41.100772039 +0000 UTC m=+154.022585063" lastFinishedPulling="2026-01-20 18:33:32.291169768 +0000 UTC m=+205.212982812" observedRunningTime="2026-01-20 18:33:32.873196681 +0000 UTC m=+205.795009715" watchObservedRunningTime="2026-01-20 18:33:32.875298536 +0000 UTC m=+205.797111560"
Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.891603 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v75d6" podStartSLOduration=2.652243463 podStartE2EDuration="56.891579262s" podCreationTimestamp="2026-01-20 18:32:36 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.026421368 +0000 UTC m=+150.948234392" lastFinishedPulling="2026-01-20 18:33:32.265757167 +0000 UTC m=+205.187570191" observedRunningTime="2026-01-20 18:33:32.891259674 +0000 UTC m=+205.813072708" watchObservedRunningTime="2026-01-20 18:33:32.891579262 +0000 UTC m=+205.813392286"
Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.454286 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" path="/var/lib/kubelet/pods/48e32f25-29eb-4ef0-892b-0da316c47e3d/volumes"
Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.782241 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"]
Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.782681 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" containerID="cri-o://919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" gracePeriod=30
Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.819203 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"]
Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.819425 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager" containerID="cri-o://9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" gracePeriod=30
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.355563 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.429418 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.460787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.460854 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461198 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461794 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config" (OuterVolumeSpecName: "config") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461855 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config" (OuterVolumeSpecName: "config") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462655 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462672 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462681 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462690 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462702 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np" (OuterVolumeSpecName: "kube-api-access-b94np") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "kube-api-access-b94np". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466719 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5" (OuterVolumeSpecName: "kube-api-access-8jhp5") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "kube-api-access-8jhp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563358 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") "
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563592 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563613 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563627 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.566116 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.664314 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849422 4773 generic.go:334] "Generic (PLEG): container finished" podID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" exitCode=0
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849486 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerDied","Data":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"}
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerDied","Data":"90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296"}
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849492 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849761 4773 scope.go:117] "RemoveContainer" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851880 4773 generic.go:334] "Generic (PLEG): container finished" podID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" exitCode=0
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851919 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerDied","Data":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"}
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851962 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerDied","Data":"90446c16d64bd4019cc2e68d6478f0e637b6dc8de10aa866b35345c2046c26a3"}
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.852296 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.864665 4773 scope.go:117] "RemoveContainer" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"
Jan 20 18:33:34 crc kubenswrapper[4773]: E0120 18:33:34.865551 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": container with ID starting with 9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7 not found: ID does not exist" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.865610 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"} err="failed to get container status \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": rpc error: code = NotFound desc = could not find container \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": container with ID starting with 9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7 not found: ID does not exist"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.865644 4773 scope.go:117] "RemoveContainer" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.881291 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"]
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.885761 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"]
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.886150 4773 scope.go:117] "RemoveContainer" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"
Jan 20 18:33:34 crc kubenswrapper[4773]: E0120 18:33:34.887311 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": container with ID starting with 919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d not found: ID does not exist" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.887356 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"} err="failed to get container status \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": rpc error: code = NotFound desc = could not find container \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": container with ID starting with 919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d not found: ID does not exist"
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.893531 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"]
Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.895782 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"]
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.137352 4773 patch_prober.go:28] interesting pod/controller-manager-7b574fff6d-mnxtv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" start-of-body=
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.137644 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.454388 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" path="/var/lib/kubelet/pods/383b1abe-3796-4b98-bb28-515ce7eafd6b/volumes"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.455182 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" path="/var/lib/kubelet/pods/4033f4d4-e3aa-40fd-b5f3-558833a6846d/volumes"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598413 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"]
Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598643 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598657 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598667 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-content"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598673 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-content"
Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598682 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598688 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server"
Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598707 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-utilities"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598713 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-utilities"
Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598724 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598730 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598824 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598838 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598846 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager"
Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.599277 4773 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.602061 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.602248 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.603052 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.603314 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.604872 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.606874 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.614792 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.615828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.618736 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619428 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619651 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.620084 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.620277 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.629383 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.634615 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.646596 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783313 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " 
pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783594 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885291 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885339 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885477 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.886912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " 
pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.886959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.887483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.888265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.888300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.891560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod 
\"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.891728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.900826 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.909610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.930652 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.935812 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.365969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:36 crc kubenswrapper[4773]: W0120 18:33:36.374591 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd65ab1d5_14b1_4f38_a627_ca6f00bb0b44.slice/crio-c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634 WatchSource:0}: Error finding container c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634: Status 404 returned error can't find the container with id c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634 Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.423494 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.875156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerStarted","Data":"0d0c05a7620d0b2d084778129f9283d19b008663b280d13e07e0f66abacb3b84"} Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.876757 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerStarted","Data":"c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.312879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.313477 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.350230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.572219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.572857 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.610356 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.883918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerStarted","Data":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.884104 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.885891 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerStarted","Data":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.891844 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 
18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.906593 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" podStartSLOduration=4.906568064 podStartE2EDuration="4.906568064s" podCreationTimestamp="2026-01-20 18:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:37.900631892 +0000 UTC m=+210.822444916" watchObservedRunningTime="2026-01-20 18:33:37.906568064 +0000 UTC m=+210.828381098" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.932148 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.937558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.949392 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" podStartSLOduration=4.949367949 podStartE2EDuration="4.949367949s" podCreationTimestamp="2026-01-20 18:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:37.947407849 +0000 UTC m=+210.869220893" watchObservedRunningTime="2026-01-20 18:33:37.949367949 +0000 UTC m=+210.871180973" Jan 20 18:33:38 crc kubenswrapper[4773]: I0120 18:33:38.891826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:38 crc kubenswrapper[4773]: I0120 18:33:38.899813 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 
18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.260530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.260793 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.304551 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.456431 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.714726 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.715133 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.754529 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.935064 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.942860 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.297491 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.298428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.343357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.700289 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.700365 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.732568 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.901854 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6lqws" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" containerID="cri-o://cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" gracePeriod=2 Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.938078 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.944041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.791197 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.850462 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908672 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" exitCode=0 Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908711 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908748 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"} Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908829 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51"} Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908856 4773 scope.go:117] "RemoveContainer" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.909011 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" containerID="cri-o://04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" gracePeriod=2 Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.927737 4773 scope.go:117] "RemoveContainer" 
containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.941965 4773 scope.go:117] "RemoveContainer" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.963811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.963887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.964051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.965519 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities" (OuterVolumeSpecName: "utilities") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.970529 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm" (OuterVolumeSpecName: "kube-api-access-pkldm") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "kube-api-access-pkldm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.024497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.056718 4773 scope.go:117] "RemoveContainer" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.057271 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": container with ID starting with cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b not found: ID does not exist" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057316 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"} err="failed to get container status \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": rpc error: code = NotFound desc = could not find 
container \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": container with ID starting with cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057343 4773 scope.go:117] "RemoveContainer" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.057870 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": container with ID starting with 884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138 not found: ID does not exist" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057951 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} err="failed to get container status \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": rpc error: code = NotFound desc = could not find container \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": container with ID starting with 884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138 not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057979 4773 scope.go:117] "RemoveContainer" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.058495 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": container with ID starting with ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d not found: ID does 
not exist" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.058519 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d"} err="failed to get container status \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": rpc error: code = NotFound desc = could not find container \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": container with ID starting with ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065795 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065833 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065848 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.233749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.239738 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.300377 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470383 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.471244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities" (OuterVolumeSpecName: "utilities") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.473222 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc" (OuterVolumeSpecName: "kube-api-access-s9wnc") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "kube-api-access-s9wnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.495001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572404 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572438 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572448 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.854650 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" containerID="cri-o://87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" gracePeriod=15 Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917691 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" exitCode=0 Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 
18:33:42.917872 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"} Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917941 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c"} Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917961 4773 scope.go:117] "RemoveContainer" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.930784 4773 scope.go:117] "RemoveContainer" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.948047 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.948447 4773 scope.go:117] "RemoveContainer" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.951357 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960490 4773 scope.go:117] "RemoveContainer" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.960884 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": container with ID starting with 04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f not found: ID does not exist" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960925 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"} err="failed to get container status \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": rpc error: code = NotFound desc = could not find container \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": container with ID starting with 04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960965 4773 scope.go:117] "RemoveContainer" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.962210 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": container with ID starting with cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8 not found: ID does not exist" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962295 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8"} err="failed to get container status \"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": rpc error: code = NotFound desc = could not find container 
\"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": container with ID starting with cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8 not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962368 4773 scope.go:117] "RemoveContainer" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.962847 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": container with ID starting with 0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545 not found: ID does not exist" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962867 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545"} err="failed to get container status \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": rpc error: code = NotFound desc = could not find container \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": container with ID starting with 0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545 not found: ID does not exist" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.456118 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" path="/var/lib/kubelet/pods/86872811-c0ef-45cc-949a-f88b07fca9b3/volumes" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.457376 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6759422-151d-4228-b7c7-848c3008fb52" path="/var/lib/kubelet/pods/a6759422-151d-4228-b7c7-848c3008fb52/volumes" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.779686 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888168 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888203 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888236 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888464 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888524 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 
crc kubenswrapper[4773]: I0120 18:33:43.888735 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.889718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.889744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.890072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.890488 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.892322 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.893296 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft" (OuterVolumeSpecName: "kube-api-access-4p5ft") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "kube-api-access-4p5ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897475 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.901165 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927693 4773 generic.go:334] "Generic (PLEG): container finished" podID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" exitCode=0 Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927745 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerDied","Data":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927854 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerDied","Data":"4f519175ce1269f87277348e7dbf3cb7cac77cd634d740bc906a3ed7230ae289"} Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927883 4773 scope.go:117] "RemoveContainer" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.950476 4773 scope.go:117] "RemoveContainer" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: E0120 18:33:43.954534 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": container with ID starting with 87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc not found: ID does not exist" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.954572 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} err="failed to get container status \"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": rpc error: code = NotFound desc = could not find container 
\"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": container with ID starting with 87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc not found: ID does not exist" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.958859 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.961860 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989560 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989590 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989604 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989614 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989625 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989635 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989644 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989657 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989667 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989676 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989685 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989694 4773 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989703 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.254082 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.255000 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxsfk" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" containerID="cri-o://2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" gracePeriod=2 Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.703301 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799660 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799755 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799785 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.800727 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities" (OuterVolumeSpecName: "utilities") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.803982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk" (OuterVolumeSpecName: "kube-api-access-txldk") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "kube-api-access-txldk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.901286 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.901315 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.913199 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.933965 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" exitCode=0 Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"} Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"c69c9f7c333b88336518be53c3fff5b901d87e988f1ab9a73e541d27f61cbb78"} Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934126 
4773 scope.go:117] "RemoveContainer" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934438 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.950179 4773 scope.go:117] "RemoveContainer" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.964202 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.967606 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.986180 4773 scope.go:117] "RemoveContainer" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.997380 4773 scope.go:117] "RemoveContainer" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.998366 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": container with ID starting with 2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745 not found: ID does not exist" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998423 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"} err="failed to get container status \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": rpc 
error: code = NotFound desc = could not find container \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": container with ID starting with 2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745 not found: ID does not exist" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998469 4773 scope.go:117] "RemoveContainer" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.998758 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": container with ID starting with 2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088 not found: ID does not exist" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998784 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} err="failed to get container status \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": rpc error: code = NotFound desc = could not find container \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": container with ID starting with 2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088 not found: ID does not exist" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998799 4773 scope.go:117] "RemoveContainer" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.999037 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": container with ID starting with 
7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3 not found: ID does not exist" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.999067 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3"} err="failed to get container status \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": rpc error: code = NotFound desc = could not find container \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": container with ID starting with 7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3 not found: ID does not exist" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.002694 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.456242 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" path="/var/lib/kubelet/pods/c19dbd84-8fec-4998-b2ae-65c68dee6b17/volumes" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.457788 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" path="/var/lib/kubelet/pods/cf25ec9b-96c5-4129-958f-35acbc34a20d/volumes" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610075 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"] Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610745 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610757 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610768 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610774 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610788 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610794 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610804 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610809 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610818 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610823 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610831 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610836 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610842 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610849 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610858 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610864 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610873 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610878 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610887 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610892 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610998 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611018 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611027 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611037 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611525 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614341 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614405 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614844 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615084 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615232 4773 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615438 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615613 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.616481 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.616731 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.619356 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.627147 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"] Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.639842 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.639869 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.648516 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " 
pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714825 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714851 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714886 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714971 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.715029 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.715066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.802438 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.802674 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager" containerID="cri-o://cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" gracePeriod=30 Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816160 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816274 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816304 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " 
pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816428 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816458 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817715 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.818200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.818727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.821727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.822175 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.825053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.827695 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.828120 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.835363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.835576 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.841403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.851258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.906792 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.907014 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" containerID="cri-o://6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" gracePeriod=30 Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.930370 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.426786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"] Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.790068 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.820147 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930747 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930809 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930896 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930949 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931013 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931078 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933096 4773 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca" (OuterVolumeSpecName: "client-ca") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config" (OuterVolumeSpecName: "config") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca" (OuterVolumeSpecName: "client-ca") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config" (OuterVolumeSpecName: "config") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933601 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.936896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.937039 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq" (OuterVolumeSpecName: "kube-api-access-h9wrq") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "kube-api-access-h9wrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.939616 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.944062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2" (OuterVolumeSpecName: "kube-api-access-ckld2") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "kube-api-access-ckld2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986040 4773 generic.go:334] "Generic (PLEG): container finished" podID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" exitCode=0 Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986423 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerDied","Data":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerDied","Data":"0d0c05a7620d0b2d084778129f9283d19b008663b280d13e07e0f66abacb3b84"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986467 4773 scope.go:117] "RemoveContainer" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986560 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989851 4773 generic.go:334] "Generic (PLEG): container finished" podID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" exitCode=0 Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989961 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerDied","Data":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.990091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerDied","Data":"c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.991321 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" event={"ID":"dcdf6037-c51f-4824-8591-fd1c8d53f086","Type":"ContainerStarted","Data":"1d7df637dc9d820da1a692567e2778f8e8028898d08aa8c07ada286db74871dc"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.991349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" event={"ID":"dcdf6037-c51f-4824-8591-fd1c8d53f086","Type":"ContainerStarted","Data":"99eb80dcf4e4a84dbd0b1fb412df39cdd2792089b7a0eddb39eb6f92fe6882e7"} Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.992007 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.005665 4773 scope.go:117] "RemoveContainer" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.006969 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": container with ID starting with cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9 not found: ID does not exist" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.007036 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"} err="failed to get container status \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": rpc error: code = NotFound desc = could not find container \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": container with ID starting with cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9 not found: ID does not exist" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.007066 4773 scope.go:117] "RemoveContainer" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.014104 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" podStartSLOduration=38.014088259 podStartE2EDuration="38.014088259s" podCreationTimestamp="2026-01-20 18:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:55.01179781 +0000 UTC m=+227.933610844" watchObservedRunningTime="2026-01-20 18:33:55.014088259 +0000 UTC m=+227.935901283" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.026855 4773 scope.go:117] "RemoveContainer" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.027293 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": container with ID starting with 6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660 not found: ID does not exist" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.027331 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"} err="failed to get container status \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": rpc error: code = NotFound desc = could not find container \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": container with ID starting with 6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660 not found: ID does not exist" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033081 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033115 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033127 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033136 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") on node \"crc\" DevicePath \"\"" Jan 
20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033145 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033155 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033164 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033174 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033183 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.036449 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.040584 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.043488 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.045701 4773 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.090807 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.454264 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" path="/var/lib/kubelet/pods/12c12fbe-31ec-4b46-ace6-ac3451850070/volumes" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.454895 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" path="/var/lib/kubelet/pods/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44/volumes" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614531 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"] Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.614725 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614736 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.614754 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614759 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614842 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager" 
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614854 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.615213 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622415 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622469 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622478 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.623531 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.623646 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.624192 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.627361 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.633830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.635013 4773 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640449 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640818 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640822 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640521 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641107 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641248 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641773 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.647987 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743162 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743226 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743256 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743282 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.844912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.844972 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845115 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845133 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846168 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846217 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.849597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.852522 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.865263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.875725 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.939604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.949744 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:56 crc kubenswrapper[4773]: I0120 18:33:56.376621 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"]
Jan 20 18:33:56 crc kubenswrapper[4773]: I0120 18:33:56.381292 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"]
Jan 20 18:33:56 crc kubenswrapper[4773]: W0120 18:33:56.387909 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f300cc_6496_42e7_84ea_d542b110a9a7.slice/crio-e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2 WatchSource:0}: Error finding container e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2: Status 404 returned error can't find the container with id e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2
Jan 20 18:33:56 crc kubenswrapper[4773]: W0120 18:33:56.395002 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1c7708_bc85_43d7_ab64_9b2b99a43557.slice/crio-612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615 WatchSource:0}: Error finding container 612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615: Status 404 returned error can't find the container with id 612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" event={"ID":"76f300cc-6496-42e7-84ea-d542b110a9a7","Type":"ContainerStarted","Data":"fc12f458c70ba48630ffe807737912e0cb47e903d0ec8f933085e5c2e0706a3b"}
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007756 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" event={"ID":"76f300cc-6496-42e7-84ea-d542b110a9a7","Type":"ContainerStarted","Data":"e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2"}
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" event={"ID":"0d1c7708-bc85-43d7-ab64-9b2b99a43557","Type":"ContainerStarted","Data":"82f8263c71807bdbfb6056daae4d133d9aca910f95392ba71950410bf787b172"}
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009205 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" event={"ID":"0d1c7708-bc85-43d7-ab64-9b2b99a43557","Type":"ContainerStarted","Data":"612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615"}
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009455 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.015031 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4"
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.016597 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.052768 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" podStartSLOduration=4.052743328 podStartE2EDuration="4.052743328s" podCreationTimestamp="2026-01-20 18:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:57.032548431 +0000 UTC m=+229.954361465" watchObservedRunningTime="2026-01-20 18:33:57.052743328 +0000 UTC m=+229.974556352"
Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.089609 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" podStartSLOduration=4.0895873 podStartE2EDuration="4.0895873s" podCreationTimestamp="2026-01-20 18:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:57.054706858 +0000 UTC m=+229.976519882" watchObservedRunningTime="2026-01-20 18:33:57.0895873 +0000 UTC m=+230.011400324"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.026271 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027349 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027477 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027666 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" gracePeriod=15
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027768 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" gracePeriod=15
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027782 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" gracePeriod=15
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027864 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" gracePeriod=15
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027843 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" gracePeriod=15
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028232 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028379 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028398 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028409 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028417 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028428 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028435 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028444 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028451 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028465 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028473 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028484 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028494 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028503 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028617 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028630 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028637 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028646 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028657 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028845 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200485 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200518 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200556 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200613 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200647 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.009174 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.009639 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.084476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.086338 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087177 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" exitCode=0
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087233 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" exitCode=0
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087249 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" exitCode=0
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087263 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" exitCode=2
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087300 4773 scope.go:117] "RemoveContainer" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.089996 4773 generic.go:334] "Generic (PLEG): container finished" podID="29920243-d87d-49b3-9215-680935300c6e" containerID="87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad" exitCode=0
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.090075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerDied","Data":"87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad"}
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.091451 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.091889 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.127343 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.459412 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.460861 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused"
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545527 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") "
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") "
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") "
Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "kubelet-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock" (OuterVolumeSpecName: "var-lock") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.546069 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.546124 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.554520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.647252 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.884268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.884970 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.885896 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.886132 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949726 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949820 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949993 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.950008 4773 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.950017 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136688 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerDied","Data":"425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917"} Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136718 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136737 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139294 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139853 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" exitCode=0 Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139901 4773 scope.go:117] "RemoveContainer" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139978 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.152609 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.152996 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.154059 4773 scope.go:117] "RemoveContainer" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.160127 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.161983 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.168665 4773 scope.go:117] "RemoveContainer" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc 
kubenswrapper[4773]: I0120 18:34:11.184247 4773 scope.go:117] "RemoveContainer" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.201254 4773 scope.go:117] "RemoveContainer" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.217497 4773 scope.go:117] "RemoveContainer" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237327 4773 scope.go:117] "RemoveContainer" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.237849 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": container with ID starting with c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da not found: ID does not exist" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237956 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da"} err="failed to get container status \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": rpc error: code = NotFound desc = could not find container \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": container with ID starting with c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237996 4773 scope.go:117] "RemoveContainer" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.238437 
4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": container with ID starting with 06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0 not found: ID does not exist" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238468 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0"} err="failed to get container status \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": rpc error: code = NotFound desc = could not find container \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": container with ID starting with 06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238487 4773 scope.go:117] "RemoveContainer" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.238791 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": container with ID starting with af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1 not found: ID does not exist" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238859 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1"} err="failed to get container status \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": rpc error: code = 
NotFound desc = could not find container \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": container with ID starting with af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238897 4773 scope.go:117] "RemoveContainer" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.239222 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": container with ID starting with 05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b not found: ID does not exist" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239270 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b"} err="failed to get container status \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": rpc error: code = NotFound desc = could not find container \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": container with ID starting with 05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239303 4773 scope.go:117] "RemoveContainer" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.239578 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": container with ID starting with 
13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7 not found: ID does not exist" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239626 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7"} err="failed to get container status \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": rpc error: code = NotFound desc = could not find container \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": container with ID starting with 13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239657 4773 scope.go:117] "RemoveContainer" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.240291 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": container with ID starting with 598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554 not found: ID does not exist" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.240376 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554"} err="failed to get container status \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": rpc error: code = NotFound desc = could not find container \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": container with ID starting with 598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554 not found: ID does not 
exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.460359 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.507886 4773 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" volumeName="registry-storage" Jan 20 18:34:13 crc kubenswrapper[4773]: E0120 18:34:13.063659 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:13 crc kubenswrapper[4773]: I0120 18:34:13.064407 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:13 crc kubenswrapper[4773]: W0120 18:34:13.089251 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa WatchSource:0}: Error finding container 29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa: Status 404 returned error can't find the container with id 29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa Jan 20 18:34:13 crc kubenswrapper[4773]: E0120 18:34:13.092397 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c842629a427ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,LastTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:34:13 crc kubenswrapper[4773]: I0120 18:34:13.152827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa"} Jan 20 18:34:14 crc kubenswrapper[4773]: I0120 18:34:14.159382 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37"} Jan 20 18:34:14 crc kubenswrapper[4773]: E0120 18:34:14.160511 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:14 crc kubenswrapper[4773]: I0120 18:34:14.160540 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:15 crc kubenswrapper[4773]: E0120 18:34:15.163993 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.921679 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.922844 4773 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.923441 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.923866 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.924179 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: I0120 18:34:16.924216 4773 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.924456 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="200ms" Jan 20 18:34:17 crc kubenswrapper[4773]: E0120 18:34:17.125910 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms" Jan 20 18:34:17 crc kubenswrapper[4773]: I0120 18:34:17.448893 
4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:17 crc kubenswrapper[4773]: E0120 18:34:17.528474 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Jan 20 18:34:18 crc kubenswrapper[4773]: E0120 18:34:18.329684 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Jan 20 18:34:19 crc kubenswrapper[4773]: E0120 18:34:19.931141 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s" Jan 20 18:34:20 crc kubenswrapper[4773]: E0120 18:34:20.917239 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c842629a427ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,LastTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.446301 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.447176 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.467336 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.467407 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:21 crc kubenswrapper[4773]: E0120 18:34:21.468317 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.468856 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300294 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300573 4773 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3" exitCode=1 Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.301108 4773 scope.go:117] "RemoveContainer" containerID="c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.301893 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.302121 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304493 4773 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1f6e095fd190521be17fa722c72a1c13cc689ee9387657247cc7961515edf235" exitCode=0 Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304526 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1f6e095fd190521be17fa722c72a1c13cc689ee9387657247cc7961515edf235"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304547 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7f1b5519ab31867be76cc5e32ab6f786b949224ff11821a9e64b396b0ced0244"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304758 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304780 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:22 crc kubenswrapper[4773]: E0120 18:34:22.305307 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.305455 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.306040 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.316836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7bae06e1d36b5c1a48d7b03dcb92788dafc5bbc46f28fde6fe206552481ca182"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"175c4314888c1cfd7476913515672d64fea343cf7e42b8c235738032ed3bb660"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ec4d23a83fd229f96b0110e5b6f9775e656e764687107efc5fb3a550b539527"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317358 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"948d8449b8652d87d8f3bd7c7e0012c9110f918045322ea2dfdeef12d9349dbf"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.324771 4773 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.324837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fcf5c8acfe82bdbf72ee4e18bd54560aec372bb9e0cd0499181010c00caac3f3"} Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.333957 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"da815e130dc9b7ec0d00e5842d09e74606a90648632c926e4885b156e56f5cae"} Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334208 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334410 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334448 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.469183 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.469492 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.477066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc 
kubenswrapper[4773]: I0120 18:34:26.756261 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:27 crc kubenswrapper[4773]: I0120 18:34:27.432287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:27 crc kubenswrapper[4773]: I0120 18:34:27.439417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.345072 4773 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.361145 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.361189 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.365468 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.368772 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb237516-46e4-442a-9643-74c0fbd4b05a" Jan 20 18:34:30 crc kubenswrapper[4773]: I0120 18:34:30.365080 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:30 crc kubenswrapper[4773]: I0120 18:34:30.365110 4773 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:36 crc kubenswrapper[4773]: I0120 18:34:36.760010 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:37 crc kubenswrapper[4773]: I0120 18:34:37.455207 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb237516-46e4-442a-9643-74c0fbd4b05a" Jan 20 18:34:38 crc kubenswrapper[4773]: I0120 18:34:38.899258 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 18:34:38 crc kubenswrapper[4773]: I0120 18:34:38.970168 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.482640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.565695 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.625979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.125055 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.171244 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.196243 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.341007 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.477792 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.491011 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.646344 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.679574 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.807704 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.882419 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.052225 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.531799 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.653852 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.711994 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.719642 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.725217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.790601 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.829126 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.853083 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.940440 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.031618 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.038390 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.168846 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.286280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.310895 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.356597 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.373334 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.429569 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.553007 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.567861 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.603740 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.741995 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.801879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.975387 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.994112 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.069209 4773 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.094193 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.315814 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.317750 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.350475 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.386166 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.521742 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.545628 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.552879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.561760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.617123 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.708031 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.718040 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.788920 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.836008 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.844861 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.875622 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.993713 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.094886 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.252718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.369596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:34:44 
crc kubenswrapper[4773]: I0120 18:34:44.372386 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.416545 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.486055 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.505657 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.505716 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.550873 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.569975 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.666804 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.730815 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.736535 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.738497 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.795786 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.830923 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.864027 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.907829 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.910530 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.912592 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.981876 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.990468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.992212 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.018290 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 18:34:45 crc 
kubenswrapper[4773]: I0120 18:34:45.036227 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.056406 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.279181 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.281243 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.316858 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.345065 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.347382 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.347916 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.370271 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.520540 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.592693 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 
18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.600846 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.684649 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.726792 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.767527 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.772832 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.942063 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.945406 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.012828 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.037548 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.041238 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.071764 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.095811 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.182775 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.273864 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.489185 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.507677 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.599754 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.669950 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.837429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.837893 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.897666 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 18:34:46 crc kubenswrapper[4773]: 
I0120 18:34:46.938771 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.167728 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.187535 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.197840 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.198554 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.227981 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.242994 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.264682 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.289652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.346785 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.421534 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.449494 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.467973 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.483922 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.502624 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.513087 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.533773 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.551050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.625890 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.645436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.652206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.669274 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.712854 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.734023 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.788652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.789229 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.789872 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.803291 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.811547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.847300 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.851341 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.077839 4773 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.115458 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.145141 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.204497 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.218952 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.284579 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.322241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.414628 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.632640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.676702 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.725756 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.844089 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.909524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:48.919515 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.130428 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.130443 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.170452 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.171715 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.262377 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.326973 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.468702 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.486895 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.533403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.598919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.669430 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.774878 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.016844 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.207015 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.242355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.328658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.353595 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.414831 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.464711 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.594498 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.624043 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.648199 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.731751 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.866121 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.892418 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.947315 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.997425 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.079336 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.151644 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.163115 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.195148 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.210093 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.305107 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.368375 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.373707 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.425812 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.481077 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.542489 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.578062 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 
18:34:51.585913 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.592566 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.731104 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.746226 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.747691 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.808079 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.868324 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.878372 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.880410 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.901891 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.004111 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.056596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.080395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.121202 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.137216 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.266688 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.335371 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.369286 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.514401 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.674895 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.675206 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.703109 4773 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.711817 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.779278 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.785004 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.841231 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.849785 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.912329 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.987471 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.041074 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.234770 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.313271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 
18:34:53.350835 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.410904 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.460043 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.487300 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.564025 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.572572 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.606419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.669183 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.704317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.909438 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.959718 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.963466 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.967787 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.967838 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.973042 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.981400 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.986153 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.986136936 podStartE2EDuration="24.986136936s" podCreationTimestamp="2026-01-20 18:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:34:53.985504521 +0000 UTC m=+286.907317555" watchObservedRunningTime="2026-01-20 18:34:53.986136936 +0000 UTC m=+286.907949960" Jan 20 18:34:54 crc kubenswrapper[4773]: I0120 18:34:54.564497 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 18:34:54 crc kubenswrapper[4773]: I0120 18:34:54.826673 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.019175 4773 reflector.go:368] Caches populated for *v1.CSIDriver 
from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.155804 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.697431 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.196085 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.315713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.340078 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.612248 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 18:35:03 crc kubenswrapper[4773]: I0120 18:35:03.195315 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 18:35:03 crc kubenswrapper[4773]: I0120 18:35:03.195946 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" gracePeriod=5 Jan 20 18:35:07 crc kubenswrapper[4773]: I0120 18:35:07.261241 4773 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.623457 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.623801 4773 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" exitCode=137 Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.765307 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.765567 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.925771 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926164 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926306 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926407 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926333 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926458 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926786 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926880 4773 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.927008 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.934561 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.027742 4773 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.027798 4773 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.452646 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636000 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636636 4773 scope.go:117] "RemoveContainer" containerID="ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636751 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:35:58 crc kubenswrapper[4773]: I0120 18:35:58.170091 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:35:58 crc kubenswrapper[4773]: I0120 18:35:58.170720 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.666527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:24 crc kubenswrapper[4773]: E0120 18:36:24.667620 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667643 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: E0120 18:36:24.667674 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667685 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667854 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 
18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667869 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.668457 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.700355 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767101 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767250 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767419 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767466 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" 
(UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.786449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868913 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868977 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.870375 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.870980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.874069 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.874119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.886594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 
18:36:24.889285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.988854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:25 crc kubenswrapper[4773]: I0120 18:36:25.169050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:25 crc kubenswrapper[4773]: W0120 18:36:25.174193 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61a15ab_aa00_47cf_8385_57f3e148832e.slice/crio-bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594 WatchSource:0}: Error finding container bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594: Status 404 returned error can't find the container with id bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594 Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.041488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" event={"ID":"f61a15ab-aa00-47cf-8385-57f3e148832e","Type":"ContainerStarted","Data":"3bc63f71dd80f941c8bf8b10d174ceb11ea94c90531a13b46795b9c38db75e0d"} Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.041904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" event={"ID":"f61a15ab-aa00-47cf-8385-57f3e148832e","Type":"ContainerStarted","Data":"bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594"} Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 
18:36:26.042149 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.063292 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" podStartSLOduration=2.06327423 podStartE2EDuration="2.06327423s" podCreationTimestamp="2026-01-20 18:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:36:26.063106557 +0000 UTC m=+378.984919571" watchObservedRunningTime="2026-01-20 18:36:26.06327423 +0000 UTC m=+378.985087254" Jan 20 18:36:28 crc kubenswrapper[4773]: I0120 18:36:28.170596 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:36:28 crc kubenswrapper[4773]: I0120 18:36:28.171215 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.965575 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.966750 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" 
containerID="cri-o://20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" gracePeriod=30 Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.980035 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.980513 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2qdpl" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" containerID="cri-o://042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" gracePeriod=30 Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.987459 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.995550 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" containerID="cri-o://cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:38.999206 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:38.999524 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" containerID="cri-o://f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.025003 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 
18:36:39.025993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.034170 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.034370 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fm4ln" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" containerID="cri-o://405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.045326 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.133973 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.134083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.137103 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.137214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" 
event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.139731 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.139815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.141075 4773 generic.go:334] "Generic (PLEG): container finished" podID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerID="cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.141108 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerDied","Data":"cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191028 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.261391 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.261694 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.262079 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" 
containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.262116 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.292832 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.293206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.293235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.296140 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.299320 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.314830 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.347830 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.397440 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.487988 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.500313 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.503194 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.548346 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595575 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595641 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.596911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities" (OuterVolumeSpecName: "utilities") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.597893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities" (OuterVolumeSpecName: "utilities") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.598698 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l" (OuterVolumeSpecName: "kube-api-access-fbj8l") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "kube-api-access-fbj8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.600710 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk" (OuterVolumeSpecName: "kube-api-access-vzmpk") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "kube-api-access-vzmpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.647435 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.696841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.696892 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod 
\"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.698399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities" (OuterVolumeSpecName: "utilities") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699335 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities" (OuterVolumeSpecName: "utilities") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699738 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc" (OuterVolumeSpecName: "kube-api-access-dppcc") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "kube-api-access-dppcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.701368 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz" (OuterVolumeSpecName: "kube-api-access-hngfz") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "kube-api-access-hngfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq" (OuterVolumeSpecName: "kube-api-access-kfrvq") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "kube-api-access-kfrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703383 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704005 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704031 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704040 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704048 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704057 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704066 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704075 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704085 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704094 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704104 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrvq\" (UniqueName: 
\"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704112 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.719554 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.720353 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.749741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807431 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807469 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807479 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807487 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.814498 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" 
event={"ID":"785e6f78-9a81-429e-8cad-f60275661e58","Type":"ContainerStarted","Data":"de423276dc21445682b64e1f6484fa7b1373ba1de2188b140b0b53670942a901"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147645 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" event={"ID":"785e6f78-9a81-429e-8cad-f60275661e58","Type":"ContainerStarted","Data":"7d6db5ee2e2391fe0aa6f228d1a5ee2222dbdff05012c5d45d6ddb93ffe1ce17"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147663 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.148960 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kcc74 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.149004 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" podUID="785e6f78-9a81-429e-8cad-f60275661e58" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.149591 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.150965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerDied","Data":"f4d91eb42c30324decc0123b0752b77625e1bfc343e356223cf0e111b47451d8"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.151002 4773 scope.go:117] "RemoveContainer" containerID="cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153501 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" exitCode=0 Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153617 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153572 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.157014 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.157013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"f57839f90df36cf23471ecda170b0c2440316e257ee6cca520a35c728d5b16de"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.159984 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.159999 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.164173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"47261c669f243247e3360eb031003a9925a21eac9889414fbe72f5ed85389a71"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.164221 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.168723 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" podStartSLOduration=2.168700018 podStartE2EDuration="2.168700018s" podCreationTimestamp="2026-01-20 18:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:36:40.164378626 +0000 UTC m=+393.086191670" watchObservedRunningTime="2026-01-20 18:36:40.168700018 +0000 UTC m=+393.090513042" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.177520 4773 scope.go:117] "RemoveContainer" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.198822 4773 scope.go:117] "RemoveContainer" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.202674 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.214873 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.228255 4773 scope.go:117] "RemoveContainer" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.230369 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.244569 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.247886 4773 scope.go:117] "RemoveContainer" 
containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.249092 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": container with ID starting with 405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1 not found: ID does not exist" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249136 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"} err="failed to get container status \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": rpc error: code = NotFound desc = could not find container \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": container with ID starting with 405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1 not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249163 4773 scope.go:117] "RemoveContainer" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.249459 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": container with ID starting with 3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db not found: ID does not exist" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249475 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} err="failed to get container status \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": rpc error: code = NotFound desc = could not find container \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": container with ID starting with 3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249488 4773 scope.go:117] "RemoveContainer" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.250330 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": container with ID starting with bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee not found: ID does not exist" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.250363 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee"} err="failed to get container status \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": rpc error: code = NotFound desc = could not find container \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": container with ID starting with bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.250386 4773 scope.go:117] "RemoveContainer" containerID="20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.254491 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.260452 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.266830 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.272283 4773 scope.go:117] "RemoveContainer" containerID="639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.272865 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.282419 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.285073 4773 scope.go:117] "RemoveContainer" containerID="1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.285463 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.304913 4773 scope.go:117] "RemoveContainer" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.321192 4773 scope.go:117] "RemoveContainer" containerID="93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.339681 4773 scope.go:117] "RemoveContainer" containerID="ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.365163 4773 scope.go:117] "RemoveContainer" 
containerID="042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.380429 4773 scope.go:117] "RemoveContainer" containerID="1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.395347 4773 scope.go:117] "RemoveContainer" containerID="2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.783476 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784106 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784130 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784143 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784151 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784161 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784169 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784180 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" 
containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784188 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784200 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784208 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784218 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784226 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784248 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784256 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784263 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784277 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" 
containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784285 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784294 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784301 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784311 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784327 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784335 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784348 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784355 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784483 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" 
containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784495 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784505 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784513 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784519 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.785809 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.788090 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.792082 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834227 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936084 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.937032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.937050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.959238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.133354 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.181010 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.345197 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:41 crc kubenswrapper[4773]: W0120 18:36:41.368349 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da64480_a8e7_4ab9_b438_dfe067f94091.slice/crio-d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c WatchSource:0}: Error finding container d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c: Status 404 returned error can't find the container with id d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.454578 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" path="/var/lib/kubelet/pods/074f367d-7a48-4046-a679-9a2d38111b8a/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.455296 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" path="/var/lib/kubelet/pods/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.455732 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" path="/var/lib/kubelet/pods/8923f3c0-0b58-4097-aa87-9df34cf90e41/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.456776 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" path="/var/lib/kubelet/pods/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2/volumes" Jan 20 
18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.457619 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" path="/var/lib/kubelet/pods/e5fd624a-2fa6-4887-83e0-779057846c71/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.779363 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.780309 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.783369 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.802351 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847727 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949682 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949736 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949766 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.950521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.950565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.973068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.109308 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187391 4773 generic.go:334] "Generic (PLEG): container finished" podID="5da64480-a8e7-4ab9-b438-dfe067f94091" containerID="d9f7e3449020892e7ad4a475051e5957ae64148987fa051d3306ccdce3086869" exitCode=0 Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerDied","Data":"d9f7e3449020892e7ad4a475051e5957ae64148987fa051d3306ccdce3086869"} Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerStarted","Data":"d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c"} Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.523673 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.175322 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.178401 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.183666 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.186454 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201799 4773 generic.go:334] "Generic (PLEG): container finished" podID="379f8421-1b6c-45c5-ae56-051b42ff6410" containerID="4a23108bb87b7b34f2c6a6518788bcca556bbe160ca79537ca82165cbb3dfb8f" exitCode=0 Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerDied","Data":"4a23108bb87b7b34f2c6a6518788bcca556bbe160ca79537ca82165cbb3dfb8f"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerStarted","Data":"250c3a8173a2d1bf9e0b3726862b2b3ea37f90f03a172273b24b5f16a6378d3c"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.209088 4773 generic.go:334] "Generic (PLEG): container finished" podID="5da64480-a8e7-4ab9-b438-dfe067f94091" containerID="07365db81d19a57af32492ba798cf44af6edc2c8c1a6bd0c2614bc04e6e9066a" exitCode=0 Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.209143 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" 
event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerDied","Data":"07365db81d19a57af32492ba798cf44af6edc2c8c1a6bd0c2614bc04e6e9066a"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266354 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367952 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.368384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.391247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.506251 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.871381 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: W0120 18:36:43.879673 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7962399c_d4d0_44f1_a788_bd4cb5a758d7.slice/crio-ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5 WatchSource:0}: Error finding container ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5: Status 404 returned error can't find the container with id ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.177510 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.183534 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.190001 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.195690 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.216121 4773 generic.go:334] "Generic (PLEG): container finished" podID="379f8421-1b6c-45c5-ae56-051b42ff6410" containerID="fd1f902118fb9db32d3da8ab9596feec0155aa7e95e11ea6a0363437b4776f3e" exitCode=0 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.216303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerDied","Data":"fd1f902118fb9db32d3da8ab9596feec0155aa7e95e11ea6a0363437b4776f3e"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.219816 4773 generic.go:334] "Generic (PLEG): container finished" podID="7962399c-d4d0-44f1-a788-bd4cb5a758d7" containerID="485255b30ab14edef9260ad2522abe87424691af4e1eb171e6c5173a8679fb83" exitCode=0 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.220464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerDied","Data":"485255b30ab14edef9260ad2522abe87424691af4e1eb171e6c5173a8679fb83"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.220581 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.225372 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerStarted","Data":"2a01cf48237122fbba8c740a3574c7640592732f28ea2f3c94c797478d8e1570"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.254909 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lnll4" podStartSLOduration=2.823319655 podStartE2EDuration="4.254896373s" podCreationTimestamp="2026-01-20 18:36:40 +0000 UTC" firstStartedPulling="2026-01-20 18:36:42.190053026 +0000 UTC m=+395.111866050" lastFinishedPulling="2026-01-20 18:36:43.621629714 +0000 UTC m=+396.543442768" observedRunningTime="2026-01-20 18:36:44.254717529 +0000 UTC m=+397.176530573" watchObservedRunningTime="2026-01-20 18:36:44.254896373 +0000 UTC m=+397.176709397" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283650 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod 
\"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: 
\"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.407275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.511039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.758901 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: W0120 18:36:44.767877 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ccd26b_7e5c_4655_a9cb_764a2d7d35d3.slice/crio-90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322 WatchSource:0}: Error finding container 90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322: Status 404 returned error can't find the container with id 90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.995287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.054728 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.233831 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3" 
containerID="552ac0644b31a05767b5893b59defbc42ee6dff8158f25f2bbb4a7da0e807835" exitCode=0 Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.234082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerDied","Data":"552ac0644b31a05767b5893b59defbc42ee6dff8158f25f2bbb4a7da0e807835"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.238841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.239199 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerStarted","Data":"4ddd2085ebf13d395dc92e1fbe5f131161bb777ca68bb10a2a54266e7778169f"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.242600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.283436 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wdwbg" podStartSLOduration=2.860261285 podStartE2EDuration="4.283420704s" podCreationTimestamp="2026-01-20 18:36:41 +0000 UTC" firstStartedPulling="2026-01-20 18:36:43.203272256 +0000 UTC m=+396.125085280" lastFinishedPulling="2026-01-20 18:36:44.626431675 +0000 UTC m=+397.548244699" observedRunningTime="2026-01-20 18:36:45.281233363 +0000 UTC m=+398.203046397" watchObservedRunningTime="2026-01-20 18:36:45.283420704 +0000 UTC 
m=+398.205233728" Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.249093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731"} Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.251618 4773 generic.go:334] "Generic (PLEG): container finished" podID="7962399c-d4d0-44f1-a788-bd4cb5a758d7" containerID="3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a" exitCode=0 Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.251675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerDied","Data":"3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.257368 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3" containerID="b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731" exitCode=0 Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.257432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerDied","Data":"b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.261400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"dbbf71e25e3224bd3743e609e818a66704a2d9f07db792aec73f085912fa58de"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.297515 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-r24nn" podStartSLOduration=1.844056438 podStartE2EDuration="4.297496571s" podCreationTimestamp="2026-01-20 18:36:43 +0000 UTC" firstStartedPulling="2026-01-20 18:36:44.22221502 +0000 UTC m=+397.144028044" lastFinishedPulling="2026-01-20 18:36:46.675655133 +0000 UTC m=+399.597468177" observedRunningTime="2026-01-20 18:36:47.295083344 +0000 UTC m=+400.216896368" watchObservedRunningTime="2026-01-20 18:36:47.297496571 +0000 UTC m=+400.219309585" Jan 20 18:36:50 crc kubenswrapper[4773]: I0120 18:36:50.281474 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"93607e4df958788ebb45f47062e62f14c9ee37bf6b263a952e3b8aed59c3c6f5"} Jan 20 18:36:50 crc kubenswrapper[4773]: I0120 18:36:50.299534 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfwbf" podStartSLOduration=3.308369548 podStartE2EDuration="6.29951701s" podCreationTimestamp="2026-01-20 18:36:44 +0000 UTC" firstStartedPulling="2026-01-20 18:36:45.235361648 +0000 UTC m=+398.157174672" lastFinishedPulling="2026-01-20 18:36:48.22650911 +0000 UTC m=+401.148322134" observedRunningTime="2026-01-20 18:36:50.295830642 +0000 UTC m=+403.217643666" watchObservedRunningTime="2026-01-20 18:36:50.29951701 +0000 UTC m=+403.221330024" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.133517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.133813 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.184605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.324490 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.110344 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.110401 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.150919 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.334730 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.507255 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.507737 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.556016 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.342023 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.511816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 
18:36:54.511864 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.552611 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:55 crc kubenswrapper[4773]: I0120 18:36:55.346752 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171391 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171811 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.172646 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.172719 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" gracePeriod=600 Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333385 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" exitCode=0 Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333442 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"} Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333851 4773 scope.go:117] "RemoveContainer" containerID="3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" Jan 20 18:37:01 crc kubenswrapper[4773]: I0120 18:37:01.340202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.285653 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" containerID="cri-o://cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" gracePeriod=30 Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.619102 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759326 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759405 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759484 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759510 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759534 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759585 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.760536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.760577 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.765240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9" (OuterVolumeSpecName: "kube-api-access-ss7d9") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "kube-api-access-ss7d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766496 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766987 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.772376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.791543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860646 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860686 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860698 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860711 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7d9\" (UniqueName: 
\"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860722 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860735 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860745 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390484 4773 generic.go:334] "Generic (PLEG): container finished" podID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" exitCode=0 Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerDied","Data":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerDied","Data":"4717aabd05ca8421c098accb226b89152753529be1fa867b484287b5c5a81ae7"} Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390573 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390844 4773 scope.go:117] "RemoveContainer" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.422669 4773 scope.go:117] "RemoveContainer" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: E0120 18:37:11.423312 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": container with ID starting with cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa not found: ID does not exist" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.423361 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} err="failed to get container status \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": rpc error: code = NotFound desc = could not find container \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": container with ID starting with cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa not found: ID does not exist" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.435572 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.440004 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.457221 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" path="/var/lib/kubelet/pods/f751520b-bf3d-4226-8850-4b3346c43a6f/volumes" Jan 20 18:39:28 crc kubenswrapper[4773]: I0120 18:39:28.169995 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:39:28 crc kubenswrapper[4773]: I0120 18:39:28.170654 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:39:58 crc kubenswrapper[4773]: I0120 18:39:58.170607 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:39:58 crc kubenswrapper[4773]: I0120 18:39:58.171114 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:40:07 crc kubenswrapper[4773]: I0120 18:40:07.618612 4773 scope.go:117] "RemoveContainer" containerID="12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.170573 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.171680 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.171756 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.172639 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.172716 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" gracePeriod=600 Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.501647 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" exitCode=0 Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.501727 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.502247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.502269 4773 scope.go:117] "RemoveContainer" containerID="6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.224813 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:43 crc kubenswrapper[4773]: E0120 18:41:43.225849 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.225865 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.226007 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.226480 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.228222 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.233656 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.234355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.234567 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bqkk8" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.275012 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.275675 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.298195 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6g44h" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.308096 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.308813 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.318332 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vv925" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.323924 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.329395 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412164 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513444 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513696 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532132 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532904 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.589899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.605043 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.626343 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.860050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: W0120 18:41:43.865867 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4380dd47_7110_43ea_af85_02675b558a8d.slice/crio-46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58 WatchSource:0}: Error finding container 46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58: Status 404 returned error can't find the container with id 46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58 Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.868608 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.944732 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" event={"ID":"4380dd47-7110-43ea-af85-02675b558a8d","Type":"ContainerStarted","Data":"46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58"} Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.005345 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:44 crc kubenswrapper[4773]: W0120 18:41:44.014116 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc249258b_878c_45b8_9886_6fee2afec18c.slice/crio-c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5 WatchSource:0}: Error finding container c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5: Status 404 returned error can't find the container with id c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5 Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.016735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.953128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" event={"ID":"c249258b-878c-45b8-9886-6fee2afec18c","Type":"ContainerStarted","Data":"c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5"} Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.954831 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tgrsg" event={"ID":"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072","Type":"ContainerStarted","Data":"28cf9898e70669c725b9c2810417ee7dc210bc16a3009d39f1cdd8ab2d8bf58c"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.972082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" 
event={"ID":"c249258b-878c-45b8-9886-6fee2afec18c","Type":"ContainerStarted","Data":"9a30b9c50f7aeeaa6ae0d52604a4d4c8af3ce57e87e93619a72a5f2128e1afc8"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.976394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tgrsg" event={"ID":"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072","Type":"ContainerStarted","Data":"9cb2086f691a8de509ff9cbcdf82f2e873c82528627934d8778da4604d10c780"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.978068 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" event={"ID":"4380dd47-7110-43ea-af85-02675b558a8d","Type":"ContainerStarted","Data":"b33be19286c19c9f58023e7d1593d6a556295876b278ebd9597c889115a3ca9a"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.994323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" podStartSLOduration=2.198339306 podStartE2EDuration="4.994290543s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:44.017682972 +0000 UTC m=+696.939495996" lastFinishedPulling="2026-01-20 18:41:46.813634209 +0000 UTC m=+699.735447233" observedRunningTime="2026-01-20 18:41:47.988233859 +0000 UTC m=+700.910046893" watchObservedRunningTime="2026-01-20 18:41:47.994290543 +0000 UTC m=+700.916103607" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.024365 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" podStartSLOduration=2.12748582 podStartE2EDuration="5.024345864s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:43.868249633 +0000 UTC m=+696.790062667" lastFinishedPulling="2026-01-20 18:41:46.765109687 +0000 UTC m=+699.686922711" observedRunningTime="2026-01-20 18:41:48.022708244 +0000 UTC m=+700.944521278" 
watchObservedRunningTime="2026-01-20 18:41:48.024345864 +0000 UTC m=+700.946158888" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.024873 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tgrsg" podStartSLOduration=1.2577275700000001 podStartE2EDuration="5.024867186s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:44.025253523 +0000 UTC m=+696.947066547" lastFinishedPulling="2026-01-20 18:41:47.792393139 +0000 UTC m=+700.714206163" observedRunningTime="2026-01-20 18:41:48.009379355 +0000 UTC m=+700.931192379" watchObservedRunningTime="2026-01-20 18:41:48.024867186 +0000 UTC m=+700.946680210" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.627203 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:53 crc kubenswrapper[4773]: I0120 18:41:53.629116 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185451 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185908 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" containerID="cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185974 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185967 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" containerID="cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186053 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" containerID="cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186065 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" containerID="cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186050 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" containerID="cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186098 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" containerID="cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.279154 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" containerID="cri-o://5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0" gracePeriod=30 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.041419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.047019 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.047762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048675 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048718 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048728 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048736 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" 
containerID="8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048745 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048753 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048762 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc" exitCode=143 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048771 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455" exitCode=143 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048807 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048825 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048833 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.050814 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051379 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051418 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" exitCode=2 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.052063 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.052455 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.055921 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.056590 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: 
I0120 18:41:59.057412 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.071770 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166007 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8v62z"] Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166300 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166333 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166341 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166349 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166360 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166380 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166388 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166397 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166412 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166419 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166429 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166436 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166445 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166452 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166461 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166468 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166479 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166486 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166496 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166503 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166513 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166520 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166530 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kubecfg-setup" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166536 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kubecfg-setup" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166640 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166654 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166677 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166685 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166696 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166706 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166714 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166725 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166733 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166741 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166751 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166988 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.169001 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178073 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178170 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178190 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178274 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178311 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178306 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178399 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178457 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178505 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178549 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178562 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178579 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178627 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178655 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178783 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178792 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash" (OuterVolumeSpecName: "host-slash") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178840 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178845 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket" (OuterVolumeSpecName: "log-socket") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178858 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178853 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log" (OuterVolumeSpecName: "node-log") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179845 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179867 4773 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179879 4773 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179891 4773 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179902 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179913 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179924 4773 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179951 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179966 4773 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179977 4773 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179990 4773 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180004 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180016 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180027 4773 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180037 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180049 4773 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180255 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovnkube-config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.186092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.194142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4" (OuterVolumeSpecName: "kube-api-access-9flh4") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "kube-api-access-9flh4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.196492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281299 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281723 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281749 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281774 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\"
(UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282071 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282148 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282176 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282253 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282416 4773 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282434 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282447 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282457 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") on node \"crc\" DevicePath \"\""
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod
\"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383677 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383763 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383839 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120
18:41:59.383954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383983 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384038 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384095 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod
\"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384213 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384757 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.396619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.398711 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.484201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056482 4773 generic.go:334] "Generic (PLEG): container finished" podID="b635b367-c3be-4a4c-910d-e6806f1fa8c2" containerID="5202eb208f6bcd0c12462bd64c86657a3a2e05f456081fc12024e21c61c6fafa" exitCode=0
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerDied","Data":"5202eb208f6bcd0c12462bd64c86657a3a2e05f456081fc12024e21c61c6fafa"}
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056809 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"f841f89d19e69ccc55df86f1f23cba4354c74821862c09e5a2a2e721d71f0226"}
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.058418 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.067982 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.068547 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"410001a1d3881fa68033cb522fb1036ff5be18d13872f61c3fe53b410c458aa8"}
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069320 4773 scope.go:117] "RemoveContainer" containerID="5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069467 4773 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.088668 4773 scope.go:117] "RemoveContainer" containerID="68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.105790 4773 scope.go:117] "RemoveContainer" containerID="aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.122454 4773 scope.go:117] "RemoveContainer" containerID="8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.143124 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"]
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.146706 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"]
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.147095 4773 scope.go:117] "RemoveContainer" containerID="6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.164263 4773 scope.go:117] "RemoveContainer" containerID="a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.178161 4773 scope.go:117] "RemoveContainer" containerID="7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.195212 4773 scope.go:117] "RemoveContainer" containerID="dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455"
Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.209292 4773 scope.go:117] "RemoveContainer" containerID="a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3"
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.078758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"a1d309f906242bc9d36f344d7cd7ef9ee8576929e5b70b0864734554998c1a8c"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"ff2809d43fc44c116c17c14d3d0e4adf9714d79f66c190c5dafc76abf6ba41fb"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"560aa7391debccc63ef5c53bd12cc1d3a266fb61c7cc61c6d7cea18bc76cfcbd"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"9716b731e54a2d3a0a72a9e9980c73e66c442d562912aad434720cbfdcb3a3a9"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"98c7c76a029003f426c8c93eb026a64780491014bbfe37fa8f6744b54b5bd91b"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"0b606de5b7543afaf650b56755321438f47d8b3b4562b6cf5debb0159790000e"}
Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.458971 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" path="/var/lib/kubelet/pods/f354424d-7f22-42d6-8bd9-00e32e78c3d3/volumes"
Jan 20 18:42:04 crc kubenswrapper[4773]: I0120 18:42:04.117920 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"7484afb2b06bc09ea22344323c2eeeb1b389047c6d94d3699e9241b7d5e1e71d"}
Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.130847 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"a3a35f9675871c14221068e27003457dec3edf918e6a77e10b6e2b7174ad7c2a"}
Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.131444 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.131463 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.157739 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" podStartSLOduration=7.15772332 podStartE2EDuration="7.15772332s" podCreationTimestamp="2026-01-20 18:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:42:06.155077267 +0000 UTC m=+719.076890321" watchObservedRunningTime="2026-01-20 18:42:06.15772332 +0000 UTC m=+719.079536344"
Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.162605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z"
Jan 20 18:42:07 crc kubenswrapper[4773]: I0120 18:42:07.136616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:07 crc kubenswrapper[4773]: I0120 18:42:07.160531 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:14 crc kubenswrapper[4773]: I0120 18:42:14.446732 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:42:14 crc kubenswrapper[4773]: E0120 18:42:14.447785 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:42:28 crc kubenswrapper[4773]: I0120 18:42:28.170088 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:42:28 crc kubenswrapper[4773]: I0120 18:42:28.170666 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:42:29 crc kubenswrapper[4773]: I0120 18:42:29.447642 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:42:29 crc kubenswrapper[4773]: I0120 18:42:29.519592 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:31 crc kubenswrapper[4773]: I0120 18:42:31.515088 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 18:42:31 crc kubenswrapper[4773]: I0120 18:42:31.515531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"c77ae071f927cd88aa64db8f3abf5d005c367450d23cbc29cab36cef26ece07f"} Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.944190 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.951019 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.956134 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957467 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.975065 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058149 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058213 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod 
\"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058780 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.077951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.297598 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.708861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:36 crc kubenswrapper[4773]: I0120 18:42:36.549219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerStarted","Data":"02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da"} Jan 20 18:42:36 crc kubenswrapper[4773]: I0120 18:42:36.549484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerStarted","Data":"d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9"} Jan 20 18:42:38 crc kubenswrapper[4773]: I0120 18:42:38.560342 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da" exitCode=0 Jan 20 18:42:38 crc kubenswrapper[4773]: I0120 18:42:38.560536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da"} Jan 20 18:42:40 crc kubenswrapper[4773]: I0120 18:42:40.574525 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="de9628e801a8e3e5a8d60037c53c68268d9f50f78b2c45d59569034e60ec6a03" exitCode=0 Jan 20 18:42:40 crc kubenswrapper[4773]: I0120 18:42:40.574602 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"de9628e801a8e3e5a8d60037c53c68268d9f50f78b2c45d59569034e60ec6a03"} Jan 20 18:42:41 crc kubenswrapper[4773]: I0120 18:42:41.582803 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="915ffdd09d2d3d224217f7fd2081300d46578fa9c61fc8af3b6eb9d81b2e1d7c" exitCode=0 Jan 20 18:42:41 crc kubenswrapper[4773]: I0120 18:42:41.582857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"915ffdd09d2d3d224217f7fd2081300d46578fa9c61fc8af3b6eb9d81b2e1d7c"} Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.002172 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.003846 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.024834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082651 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.184366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.184808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.217236 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.319476 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.726113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: W0120 18:42:42.741220 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6959fd3_1296_4909_b0c1_0803e1e7b098.slice/crio-e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a WatchSource:0}: Error finding container e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a: Status 404 returned error can't find the container with id e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.803102 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996461 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996545 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod 
\"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.997512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle" (OuterVolumeSpecName: "bundle") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.000804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7" (OuterVolumeSpecName: "kube-api-access-vvzr7") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "kube-api-access-vvzr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.007512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util" (OuterVolumeSpecName: "util") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.097760 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.098040 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.098146 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.538568 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594425 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558" exitCode=0 Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594481 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594526 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerStarted","Data":"e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598569 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598615 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598615 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.609430 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a" exitCode=0 Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.609758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a"} Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655614 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655863 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655883 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655898 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="pull" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655907 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="pull" Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655949 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="util" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655961 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="util" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.656093 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.656521 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.658395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.658618 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.659805 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6z7v9" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.670067 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.833402 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8x57\" (UniqueName: 
\"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.935197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8x57\" (UniqueName: \"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.954063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8x57\" (UniqueName: \"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.970785 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.377992 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.616283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerStarted","Data":"d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47"} Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.617143 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" event={"ID":"a0e928f6-ac84-4903-ab0e-08557dea077f","Type":"ContainerStarted","Data":"1bf72a9437e6bde709d32f7b15f7385fec46d8d0694cfb8db5733c191ce29ac8"} Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.633156 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4nm5" podStartSLOduration=3.198467814 podStartE2EDuration="5.633139823s" podCreationTimestamp="2026-01-20 18:42:41 +0000 UTC" firstStartedPulling="2026-01-20 18:42:43.596962348 +0000 UTC m=+756.518775392" lastFinishedPulling="2026-01-20 18:42:46.031634377 +0000 UTC m=+758.953447401" observedRunningTime="2026-01-20 18:42:46.632596281 +0000 UTC m=+759.554409305" watchObservedRunningTime="2026-01-20 18:42:46.633139823 +0000 UTC m=+759.554952847" Jan 20 18:42:48 crc kubenswrapper[4773]: I0120 18:42:48.627813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" event={"ID":"a0e928f6-ac84-4903-ab0e-08557dea077f","Type":"ContainerStarted","Data":"72a178d0f105384856984ab8cbfb4affcf3963a8e0e1bd5f2c9da1e94f3016e8"} Jan 20 18:42:48 crc kubenswrapper[4773]: I0120 18:42:48.644067 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" podStartSLOduration=1.5989066250000001 podStartE2EDuration="3.644053941s" podCreationTimestamp="2026-01-20 18:42:45 +0000 UTC" firstStartedPulling="2026-01-20 18:42:46.382940307 +0000 UTC m=+759.304753331" lastFinishedPulling="2026-01-20 18:42:48.428087623 +0000 UTC m=+761.349900647" observedRunningTime="2026-01-20 18:42:48.641154402 +0000 UTC m=+761.562967426" watchObservedRunningTime="2026-01-20 18:42:48.644053941 +0000 UTC m=+761.565866965" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.319770 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.320305 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.368443 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.700808 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:53 crc kubenswrapper[4773]: I0120 18:42:53.192802 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:54 crc kubenswrapper[4773]: I0120 18:42:54.662415 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4nm5" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" containerID="cri-o://d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" gracePeriod=2 Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.180955 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc 
kubenswrapper[4773]: I0120 18:42:55.183434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.186525 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-d5t26" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.187768 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.188911 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191063 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q9h42"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191223 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191825 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.202226 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.208735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.298608 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.299471 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301052 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301636 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jwqzz" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.341764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348796 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348908 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 
20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.349004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.349024 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450172 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: 
\"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450374 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450400 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.451104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.451159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.457560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.473349 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.473513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7n4\" (UniqueName: 
\"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.480150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.500202 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.500924 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.507769 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.514014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.524587 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.529369 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552492 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552573 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.553009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.559543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.571424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.614403 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.653969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654384 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654451 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: 
\"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.655881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.655989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.656650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " 
pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.657313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.659203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.660205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.669019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q9h42" event={"ID":"ef435627-8918-4451-8d3a-23e494e29f56","Type":"ContainerStarted","Data":"05eb12614a2150580ce00e48bbfc5e9ca06a56577d6ceeeb8f48ddc46ca9d8f4"} Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.673905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.830593 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.847431 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.912742 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.980060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:56 crc kubenswrapper[4773]: W0120 18:42:56.034609 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c9dff2_2e8c_4b93_8bd0_b9eacce8ac7d.slice/crio-bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d WatchSource:0}: Error finding container bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d: Status 404 returned error can't find the container with id bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.035786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.675561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" event={"ID":"431c5397-9244-4083-9659-59210fd6d5c0","Type":"ContainerStarted","Data":"7661862e1288a88888ee9336fc2c0afe4f6c138e7cbb9fde241f41a6cbc23529"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.676548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" 
event={"ID":"9380b21a-b971-4bb9-9572-d795f171b941","Type":"ContainerStarted","Data":"dec075386fc9aa0b4e343e74f81a008446f76bca5ac0548931ae79c2116d405d"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.677725 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-qtlgh" event={"ID":"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d","Type":"ContainerStarted","Data":"bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.678861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"6c2149ba53eca7d401577818bf45432410d0c9927772528b9057f4e2713ab2a8"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.690522 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" exitCode=0 Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.690632 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.694343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-qtlgh" event={"ID":"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d","Type":"ContainerStarted","Data":"603f7f7853baaee0e82b5c825ecc6d119dc16435e5e1f7e5cd8652f387eca8f9"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.717743 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d5d894b64-qtlgh" podStartSLOduration=2.717724844 podStartE2EDuration="2.717724844s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:42:57.71502021 +0000 UTC m=+770.636833254" watchObservedRunningTime="2026-01-20 18:42:57.717724844 +0000 UTC m=+770.639537868" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.773827 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884343 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884546 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.885452 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities" (OuterVolumeSpecName: "utilities") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.888808 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc" (OuterVolumeSpecName: "kube-api-access-bsnqc") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "kube-api-access-bsnqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.985615 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.985650 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.030247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.086626 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.170278 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.170345 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702097 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a"} Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702512 4773 scope.go:117] "RemoveContainer" containerID="d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.734565 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.741072 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.046737 4773 scope.go:117] "RemoveContainer" containerID="5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.070179 4773 scope.go:117] "RemoveContainer" containerID="db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.457086 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" path="/var/lib/kubelet/pods/e6959fd3-1296-4909-b0c1-0803e1e7b098/volumes" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.710719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" event={"ID":"9380b21a-b971-4bb9-9572-d795f171b941","Type":"ContainerStarted","Data":"3544b02447bb0e3a305cebd9506121e0a13091893d90d5bc64937d5a0a3db6f3"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.711542 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.712709 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q9h42" event={"ID":"ef435627-8918-4451-8d3a-23e494e29f56","Type":"ContainerStarted","Data":"551b3190f510a02c1f4d02f56e2f205e22a01e44f6c550b1d943e7464107671c"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.713264 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.714900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"22230080d84ea1a9bcc5237f0dacc857101432fdf7998c686a1871e0552adc0a"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.715921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" event={"ID":"431c5397-9244-4083-9659-59210fd6d5c0","Type":"ContainerStarted","Data":"9c854106b14595e79d5cf713fe491262958c607c17153851a4b5f877fdcb1bc2"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.735523 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" podStartSLOduration=1.608427279 podStartE2EDuration="4.735507906s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.988254287 +0000 UTC m=+768.910067311" lastFinishedPulling="2026-01-20 18:42:59.115334914 +0000 UTC m=+772.037147938" observedRunningTime="2026-01-20 18:42:59.726829479 +0000 UTC m=+772.648642503" watchObservedRunningTime="2026-01-20 18:42:59.735507906 +0000 UTC m=+772.657320930" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.744073 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-q9h42" podStartSLOduration=1.254455136 podStartE2EDuration="4.74405208s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.561864064 +0000 UTC m=+768.483677078" lastFinishedPulling="2026-01-20 18:42:59.051461008 +0000 UTC m=+771.973274022" observedRunningTime="2026-01-20 18:42:59.742958974 +0000 UTC m=+772.664771998" watchObservedRunningTime="2026-01-20 18:42:59.74405208 +0000 UTC m=+772.665865104" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.756760 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" podStartSLOduration=1.5474426829999999 podStartE2EDuration="4.756745093s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.84053177 +0000 UTC m=+768.762344794" lastFinishedPulling="2026-01-20 18:42:59.04983418 +0000 UTC m=+771.971647204" observedRunningTime="2026-01-20 18:42:59.755147215 +0000 UTC m=+772.676960229" watchObservedRunningTime="2026-01-20 18:42:59.756745093 +0000 UTC m=+772.678558117" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.011481 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.012773 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-utilities" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.012910 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-utilities" Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.013062 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-content" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013073 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-content" Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.013084 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013367 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013736 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.014921 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.026394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124307 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124552 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.225945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.225997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226515 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.244827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.341340 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.877399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732034 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c" exitCode=0 Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c"} Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732620 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerStarted","Data":"ac9a6107ce857d0ef38abc81812d1f9b30bf91a78fba8fb115e05a1f967c57f9"} Jan 20 18:43:02 crc 
kubenswrapper[4773]: I0120 18:43:02.740652 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"1c83f60df914c0960493a33cc8925d97dbd3fc76f7c8f08bb6e2882f04f67b64"} Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.743513 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539" exitCode=0 Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.743549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539"} Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.760382 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" podStartSLOduration=2.067545015 podStartE2EDuration="7.760364031s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.931489282 +0000 UTC m=+768.853302306" lastFinishedPulling="2026-01-20 18:43:01.624308298 +0000 UTC m=+774.546121322" observedRunningTime="2026-01-20 18:43:02.755360272 +0000 UTC m=+775.677173296" watchObservedRunningTime="2026-01-20 18:43:02.760364031 +0000 UTC m=+775.682177045" Jan 20 18:43:03 crc kubenswrapper[4773]: I0120 18:43:03.749621 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerStarted","Data":"c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97"} Jan 20 18:43:03 crc kubenswrapper[4773]: I0120 18:43:03.765879 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-sxf5z" podStartSLOduration=3.320281419 podStartE2EDuration="4.765856956s" podCreationTimestamp="2026-01-20 18:42:59 +0000 UTC" firstStartedPulling="2026-01-20 18:43:01.734465448 +0000 UTC m=+774.656278472" lastFinishedPulling="2026-01-20 18:43:03.180040985 +0000 UTC m=+776.101854009" observedRunningTime="2026-01-20 18:43:03.764134946 +0000 UTC m=+776.685947960" watchObservedRunningTime="2026-01-20 18:43:03.765856956 +0000 UTC m=+776.687669980" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.551013 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.848100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.848146 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.852099 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:06 crc kubenswrapper[4773]: I0120 18:43:06.766618 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:06 crc kubenswrapper[4773]: I0120 18:43:06.813979 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.238964 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.239970 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251038 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.252065 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353121 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353567 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353615 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.373200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.560156 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.945328 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: W0120 18:43:07.953622 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc61da58_a0a9_4c56_bd9f_d84ac0474556.slice/crio-267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea WatchSource:0}: Error finding container 267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea: Status 404 returned error can't find the container with id 267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775071 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" exitCode=0 Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17"} Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerStarted","Data":"267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea"} Jan 20 18:43:09 crc kubenswrapper[4773]: I0120 18:43:09.781782 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" exitCode=0 Jan 20 18:43:09 crc kubenswrapper[4773]: I0120 
18:43:09.782059 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7"} Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.341486 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.341722 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.389459 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.790595 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerStarted","Data":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.816494 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2hz4f" podStartSLOduration=2.4261868079999998 podStartE2EDuration="3.816472593s" podCreationTimestamp="2026-01-20 18:43:07 +0000 UTC" firstStartedPulling="2026-01-20 18:43:08.777410353 +0000 UTC m=+781.699223377" lastFinishedPulling="2026-01-20 18:43:10.167696128 +0000 UTC m=+783.089509162" observedRunningTime="2026-01-20 18:43:10.815414007 +0000 UTC m=+783.737227041" watchObservedRunningTime="2026-01-20 18:43:10.816472593 +0000 UTC m=+783.738285617" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.841794 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxf5z" Jan 
20 18:43:12 crc kubenswrapper[4773]: I0120 18:43:12.622647 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:12 crc kubenswrapper[4773]: I0120 18:43:12.802043 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxf5z" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" containerID="cri-o://c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" gracePeriod=2 Jan 20 18:43:13 crc kubenswrapper[4773]: I0120 18:43:13.808636 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" exitCode=0 Jan 20 18:43:13 crc kubenswrapper[4773]: I0120 18:43:13.808727 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97"} Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.288486 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448476 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448622 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.449565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities" (OuterVolumeSpecName: "utilities") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.454114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897" (OuterVolumeSpecName: "kube-api-access-kp897") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "kube-api-access-kp897". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.490681 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551017 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551307 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551317 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.819435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"ac9a6107ce857d0ef38abc81812d1f9b30bf91a78fba8fb115e05a1f967c57f9"} Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.819515 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.820788 4773 scope.go:117] "RemoveContainer" containerID="c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.839612 4773 scope.go:117] "RemoveContainer" containerID="929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.850637 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.856638 4773 scope.go:117] "RemoveContainer" containerID="5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.857134 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:15 crc kubenswrapper[4773]: I0120 18:43:15.462750 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" path="/var/lib/kubelet/pods/4c2883aa-bc8e-4893-807a-f32cbd1ff77d/volumes" Jan 20 18:43:15 crc kubenswrapper[4773]: I0120 18:43:15.531250 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.561137 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.561558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.604428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc 
kubenswrapper[4773]: I0120 18:43:17.875493 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:18 crc kubenswrapper[4773]: I0120 18:43:18.822220 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:19 crc kubenswrapper[4773]: I0120 18:43:19.850014 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2hz4f" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" containerID="cri-o://daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" gracePeriod=2 Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.719969 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801333 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801388 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc 
kubenswrapper[4773]: I0120 18:43:20.802158 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities" (OuterVolumeSpecName: "utilities") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.806702 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq" (OuterVolumeSpecName: "kube-api-access-9r2vq") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "kube-api-access-9r2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.825320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858366 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" exitCode=0 Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea"} Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858521 4773 scope.go:117] "RemoveContainer" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858750 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.886137 4773 scope.go:117] "RemoveContainer" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902674 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902695 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.903976 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.909971 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.919214 4773 scope.go:117] "RemoveContainer" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.936611 4773 scope.go:117] "RemoveContainer" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.937074 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": container with ID starting with daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b not found: ID does not exist" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937110 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} err="failed to get container status \"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": rpc error: code = NotFound desc = could not find container \"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": container with ID starting with daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b not found: ID does not exist" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937136 4773 scope.go:117] "RemoveContainer" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.937525 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": container with ID starting with a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7 not found: ID does not exist" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937548 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7"} err="failed to get container status \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": rpc error: code = NotFound desc = could not find container \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": container with ID 
starting with a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7 not found: ID does not exist" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937566 4773 scope.go:117] "RemoveContainer" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.938038 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": container with ID starting with b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17 not found: ID does not exist" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.938126 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17"} err="failed to get container status \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": rpc error: code = NotFound desc = could not find container \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": container with ID starting with b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17 not found: ID does not exist" Jan 20 18:43:21 crc kubenswrapper[4773]: I0120 18:43:21.457511 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" path="/var/lib/kubelet/pods/cc61da58-a0a9-4c56-bd9f-d84ac0474556/volumes" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.170192 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 
18:43:28.170775 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.170821 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.171373 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.171421 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" gracePeriod=600 Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832227 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.832967 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832983 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-content" Jan 20 
18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.832992 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832999 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833007 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833014 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833029 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833035 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833047 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833054 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833067 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833073 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-utilities" Jan 20 
18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833190 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833201 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.834039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.838469 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.842258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910048 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" exitCode=0 Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910463 4773 scope.go:117] "RemoveContainer" containerID="8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917490 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917612 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019130 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019836 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.047872 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.150625 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.369470 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:29 crc kubenswrapper[4773]: W0120 18:43:29.375023 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0b8536_2fc2_4203_a22f_a2dc29d0b737.slice/crio-f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd WatchSource:0}: Error finding container f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd: Status 404 returned error can't find the container with id f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.922500 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="b8c7934e89e94988867800ba6aa7353813716b12795088fc773b31014106afe8" exitCode=0 Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.922909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"b8c7934e89e94988867800ba6aa7353813716b12795088fc773b31014106afe8"} Jan 20 18:43:29 crc 
kubenswrapper[4773]: I0120 18:43:29.922994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerStarted","Data":"f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd"} Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.882391 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9nh6h" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" containerID="cri-o://9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" gracePeriod=15 Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.935230 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="34f15b6ad0b93f7ef4e8ebecf6f99e7748713602c43c29180183e7a27121e6af" exitCode=0 Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.935277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"34f15b6ad0b93f7ef4e8ebecf6f99e7748713602c43c29180183e7a27121e6af"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.269811 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9nh6h_ba3736bb-3d36-4a0c-91fa-85f410849312/console/0.log" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.270409 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462450 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463025 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463156 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463668 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463800 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config" (OuterVolumeSpecName: "console-config") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.468813 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.469248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.469291 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv" (OuterVolumeSpecName: "kube-api-access-8ddbv") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "kube-api-access-8ddbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565125 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565204 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565232 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565246 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565260 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565273 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565286 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc 
kubenswrapper[4773]: I0120 18:43:32.944360 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="44de06d7de234234dff221a8c874efef7482abe3382b295c53f3bd9b7efd8e03" exitCode=0 Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.944434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"44de06d7de234234dff221a8c874efef7482abe3382b295c53f3bd9b7efd8e03"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946370 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9nh6h_ba3736bb-3d36-4a0c-91fa-85f410849312/console/0.log" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946400 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" exitCode=2 Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerDied","Data":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerDied","Data":"e7d6c34d2f903961f01ee5fdb975fd359e3c9fc20a9e900b7c3df54dd10fd2d7"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946465 4773 scope.go:117] "RemoveContainer" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946543 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.977295 4773 scope.go:117] "RemoveContainer" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: E0120 18:43:32.977907 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": container with ID starting with 9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367 not found: ID does not exist" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.978055 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} err="failed to get container status \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": rpc error: code = NotFound desc = could not find container \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": container with ID starting with 9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367 not found: ID does not exist" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.988080 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.992505 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:33 crc kubenswrapper[4773]: I0120 18:43:33.455348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" path="/var/lib/kubelet/pods/ba3736bb-3d36-4a0c-91fa-85f410849312/volumes" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.161205 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.285839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286948 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle" (OuterVolumeSpecName: "bundle") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.290426 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx" (OuterVolumeSpecName: "kube-api-access-hbmpx") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). 
InnerVolumeSpecName "kube-api-access-hbmpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.300857 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util" (OuterVolumeSpecName: "util") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.386967 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.387229 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.387240 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd"} Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965721 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965561 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.235582 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236181 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="util" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236196 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="util" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236211 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="pull" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236218 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="pull" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236235 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236243 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236253 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236260 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236386 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" 
containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236396 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238581 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238615 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qnrhn" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.239298 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.247702 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.249612 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.265832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 
20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.265906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.266025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366693 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.373396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.378632 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.385361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.488824 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.489565 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.491767 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.491982 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bp4hd" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.493052 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.504869 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.554854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671504 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772490 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.779013 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod 
\"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.789235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.801957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.813262 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:45 crc kubenswrapper[4773]: I0120 18:43:45.032753 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:45 crc kubenswrapper[4773]: I0120 18:43:45.306694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:45 crc kubenswrapper[4773]: W0120 18:43:45.309646 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbba9cb2_22ec_4f8c_8550_f3a69901785c.slice/crio-f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a WatchSource:0}: Error finding container f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a: Status 404 returned error can't find the container with id f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a Jan 20 18:43:46 crc kubenswrapper[4773]: I0120 18:43:46.023259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" event={"ID":"cbba9cb2-22ec-4f8c-8550-f3a69901785c","Type":"ContainerStarted","Data":"f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a"} Jan 20 18:43:46 crc kubenswrapper[4773]: I0120 18:43:46.024669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" event={"ID":"a93bbf26-2683-4cf0-a45a-1639d6da4e01","Type":"ContainerStarted","Data":"1ff00c78a12dda61decf1f705cdae7b2aaaf9a2bd9a11bdb90f9209cd788f971"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.050695 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" 
event={"ID":"a93bbf26-2683-4cf0-a45a-1639d6da4e01","Type":"ContainerStarted","Data":"2842fe0fc789e8dc9af0a9e13ee0747e4a399fb4db9f2f5e9372832e1ad2d0d5"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.051053 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.053204 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" event={"ID":"cbba9cb2-22ec-4f8c-8550-f3a69901785c","Type":"ContainerStarted","Data":"38132df97442f0f2cfd9500866f7693c1ed2218bb81bc3780ddda9306fb6d716"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.053535 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.080238 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" podStartSLOduration=1.741248254 podStartE2EDuration="6.080217815s" podCreationTimestamp="2026-01-20 18:43:44 +0000 UTC" firstStartedPulling="2026-01-20 18:43:45.042164387 +0000 UTC m=+817.963977411" lastFinishedPulling="2026-01-20 18:43:49.381133948 +0000 UTC m=+822.302946972" observedRunningTime="2026-01-20 18:43:50.079311003 +0000 UTC m=+823.001124047" watchObservedRunningTime="2026-01-20 18:43:50.080217815 +0000 UTC m=+823.002030849" Jan 20 18:44:04 crc kubenswrapper[4773]: I0120 18:44:04.821311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:44:04 crc kubenswrapper[4773]: I0120 18:44:04.854014 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" podStartSLOduration=16.763260217 
podStartE2EDuration="20.853994708s" podCreationTimestamp="2026-01-20 18:43:44 +0000 UTC" firstStartedPulling="2026-01-20 18:43:45.312754591 +0000 UTC m=+818.234567615" lastFinishedPulling="2026-01-20 18:43:49.403489082 +0000 UTC m=+822.325302106" observedRunningTime="2026-01-20 18:43:50.098459901 +0000 UTC m=+823.020272935" watchObservedRunningTime="2026-01-20 18:44:04.853994708 +0000 UTC m=+837.775807732" Jan 20 18:44:24 crc kubenswrapper[4773]: I0120 18:44:24.557357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.228224 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.229417 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.231285 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.233270 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t5r27" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.239344 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l2dcx"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.280676 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.289193 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.289592 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.339993 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.376961 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7xdr9"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.378232 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.388966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389397 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-stb8g" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc 
kubenswrapper[4773]: I0120 18:44:25.411474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411529 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411553 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411587 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411604 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411690 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.422107 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.425382 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.430183 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.430225 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512533 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 
18:44:25.512597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512649 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.512822 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.512867 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs podName:859ada1b-1a7b-4032-a974-2ec3571aa069 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.012853121 +0000 UTC m=+858.934666145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs") pod "frr-k8s-l2dcx" (UID: "859ada1b-1a7b-4032-a974-2ec3571aa069") : secret "frr-k8s-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513639 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513704 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprmd\" (UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513761 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.514060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.520328 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.528874 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.534758 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614399 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614632 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rprmd\" (UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614668 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.614688 4773 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.614767 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.114747308 +0000 UTC m=+859.036560332 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "speaker-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.615206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.615833 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.616014 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.115992198 +0000 UTC m=+859.037805222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "metallb-memberlist" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.622092 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.637651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprmd\" (UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.715814 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.716111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.716166 4773 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.716254 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs podName:e284563a-e5f1-4c86-8100-863f86a7f7dc nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.216229926 +0000 UTC m=+859.138043020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs") pod "controller-6968d8fdc4-2nprh" (UID: "e284563a-e5f1-4c86-8100-863f86a7f7dc") : secret "controller-certs-secret" not found
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.716172 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.718459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.730610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.733101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.849581 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"]
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.018696 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.022580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.119844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.119987 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:26 crc kubenswrapper[4773]: E0120 18:44:26.120139 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 20 18:44:26 crc kubenswrapper[4773]: E0120 18:44:26.120214 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:27.12019657 +0000 UTC m=+860.042009594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "metallb-memberlist" not found
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.124663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.220942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.224498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.234106 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.290159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" event={"ID":"05a83b70-ac51-4951-92c6-0f90265f2958","Type":"ContainerStarted","Data":"0e5b53f31f8de0b2dcb77f43d60fa89eba95be443948366d9e2b07bcb018e307"}
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.342812 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.524796 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"]
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.131132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.137528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.212528 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:27 crc kubenswrapper[4773]: W0120 18:44:27.231088 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce74c2a6_61b7_4fb4_883f_e86bf4b5c604.slice/crio-91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932 WatchSource:0}: Error finding container 91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932: Status 404 returned error can't find the container with id 91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ddbbe016a23000596d2f746ae14a3dcee4dd29fa065d50df40ee7dfece2f95b5"}
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298701 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ebbbd6518ebee81cf3355bc6794ea6a10bfa5776db0858b9d7313db90b4ba0fd"}
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ee8f2a077f8dcbd844971c220ccb97d56367860ed80d8a7f150ee271db5b7ce1"}
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298754 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.301403 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932"}
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.303877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"3341d93471f0c1295966cff7dd30eb227592a98bdc54cfa3029f751141d436b5"}
Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.319142 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-2nprh" podStartSLOduration=2.319121045 podStartE2EDuration="2.319121045s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:44:27.315505519 +0000 UTC m=+860.237318563" watchObservedRunningTime="2026-01-20 18:44:27.319121045 +0000 UTC m=+860.240934079"
Jan 20 18:44:28 crc kubenswrapper[4773]: I0120 18:44:28.317653 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"ce00d9899cdea324d159fab7c0f43030f425057fdb6604d480c2d38284d8ca51"}
Jan 20 18:44:28 crc kubenswrapper[4773]: I0120 18:44:28.318042 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"af9d107c6c96c1737482efa3fee7c1f55427c31284f49869482d20bae615152e"}
Jan 20 18:44:29 crc kubenswrapper[4773]: I0120 18:44:29.324188 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.346328 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-2nprh"
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.367196 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7xdr9" podStartSLOduration=11.367167011 podStartE2EDuration="11.367167011s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:44:28.333229885 +0000 UTC m=+861.255042919" watchObservedRunningTime="2026-01-20 18:44:36.367167011 +0000 UTC m=+869.288980055"
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.370768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" event={"ID":"05a83b70-ac51-4951-92c6-0f90265f2958","Type":"ContainerStarted","Data":"f23c5b5346cff941af97bdb2e657e059edf290c29fb99a43ba3a4ef0127ad738"}
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.370816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.372607 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="24be42e4f07a05c76c2370acb5fc7583453c39d07e696c13462480c42ac89128" exitCode=0
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.372839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"24be42e4f07a05c76c2370acb5fc7583453c39d07e696c13462480c42ac89128"}
Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.392186 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" podStartSLOduration=1.8882776940000001 podStartE2EDuration="11.392165343s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="2026-01-20 18:44:25.85926128 +0000 UTC m=+858.781074304" lastFinishedPulling="2026-01-20 18:44:35.363148929 +0000 UTC m=+868.284961953" observedRunningTime="2026-01-20 18:44:36.389994472 +0000 UTC m=+869.311807496" watchObservedRunningTime="2026-01-20 18:44:36.392165343 +0000 UTC m=+869.313978367"
Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.216672 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7xdr9"
Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.379850 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="10a5291e038952eedd7f5950b9c7d6d2287f60283c0b83278e7b1ab35d31df53" exitCode=0
Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.379895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"10a5291e038952eedd7f5950b9c7d6d2287f60283c0b83278e7b1ab35d31df53"}
Jan 20 18:44:38 crc kubenswrapper[4773]: I0120 18:44:38.387302 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="fe4e1fe6ec01f7f1441ba1d107351c60710ba3dc737abceb56e8889409723eda" exitCode=0
Jan 20 18:44:38 crc kubenswrapper[4773]: I0120 18:44:38.387379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"fe4e1fe6ec01f7f1441ba1d107351c60710ba3dc737abceb56e8889409723eda"}
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.397495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"3d0f0ec31622fd84028344d2f5896e32bbf84545cc61e5fdb0a52e6ec1da01f7"}
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.845504 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"]
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.846368 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.848547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q6ctx"
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.849713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.849943 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.863759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"]
Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.037044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.138683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.157348 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.164906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.638261 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"]
Jan 20 18:44:40 crc kubenswrapper[4773]: W0120 18:44:40.646278 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31a7f2a_24a2_429d_b654_9d87755f5812.slice/crio-18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27 WatchSource:0}: Error finding container 18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27: Status 404 returned error can't find the container with id 18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27
Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.424664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"9ca0d3a8e2cbd40a33d56516cee7c0779c2441cfa13711a674197f4c23b87316"}
Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425027 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"ba81d11f156fbd8d5277da65419d02dbe9369356c1857746cca5f15949b7c342"}
Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425044 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"c89abc383aa3f4fcf81697886b45bfc9a36fd36a3d93cf2314a2a27367653cd4"}
Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425054 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"202e627e34c3563d7cda61eb9aaab331de1087b86e0de9d10c0eb56feb80b0e6"}
Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.426603 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerStarted","Data":"18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27"}
Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.436263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"5080ec2ca4f81b6d3591ef2e22910c427a22346c4648de2af8f453c4e2553dd4"}
Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.437228 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l2dcx"
Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.458699 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l2dcx" podStartSLOduration=8.530480219 podStartE2EDuration="17.458683091s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="2026-01-20 18:44:26.406128624 +0000 UTC m=+859.327941648" lastFinishedPulling="2026-01-20 18:44:35.334331496 +0000 UTC m=+868.256144520" observedRunningTime="2026-01-20 18:44:42.458171799 +0000 UTC m=+875.379984843" watchObservedRunningTime="2026-01-20 18:44:42.458683091 +0000 UTC m=+875.380496115"
Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.618800 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"]
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.226145 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"]
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.227342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zjdsq"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.240351 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"]
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.377012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.442889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerStarted","Data":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"}
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.443227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nbdm9" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" containerID="cri-o://af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" gracePeriod=2
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.462459 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nbdm9" podStartSLOduration=2.492377816 podStartE2EDuration="4.462426925s" podCreationTimestamp="2026-01-20 18:44:39 +0000 UTC" firstStartedPulling="2026-01-20 18:44:40.648345361 +0000 UTC m=+873.570158385" lastFinishedPulling="2026-01-20 18:44:42.61839447 +0000 UTC m=+875.540207494" observedRunningTime="2026-01-20 18:44:43.459442925 +0000 UTC m=+876.381256029" watchObservedRunningTime="2026-01-20 18:44:43.462426925 +0000 UTC m=+876.384240039"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.478979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.497925 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.549690 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zjdsq"
Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.929658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"]
Jan 20 18:44:43 crc kubenswrapper[4773]: W0120 18:44:43.940819 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4d7dc5_ceca_4e4b_81af_9368937b7462.slice/crio-b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e WatchSource:0}: Error finding container b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e: Status 404 returned error can't find the container with id b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e
Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.286783 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9"
Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.392139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"b31a7f2a-24a2-429d-b654-9d87755f5812\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") "
Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.397996 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc" (OuterVolumeSpecName: "kube-api-access-n88pc") pod "b31a7f2a-24a2-429d-b654-9d87755f5812" (UID: "b31a7f2a-24a2-429d-b654-9d87755f5812"). InnerVolumeSpecName "kube-api-access-n88pc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.449032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zjdsq" event={"ID":"ba4d7dc5-ceca-4e4b-81af-9368937b7462","Type":"ContainerStarted","Data":"31cb2d99855fb582c69bb88857d6f3426e1654fa86bcd739551bbc96d160eeb5"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.449073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zjdsq" event={"ID":"ba4d7dc5-ceca-4e4b-81af-9368937b7462","Type":"ContainerStarted","Data":"b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450570 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450550 4773 generic.go:334] "Generic (PLEG): container finished" podID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" exitCode=0 Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerDied","Data":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450652 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerDied","Data":"18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450669 4773 scope.go:117] "RemoveContainer" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 
20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.469609 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zjdsq" podStartSLOduration=1.421358425 podStartE2EDuration="1.46959172s" podCreationTimestamp="2026-01-20 18:44:43 +0000 UTC" firstStartedPulling="2026-01-20 18:44:43.944707997 +0000 UTC m=+876.866521041" lastFinishedPulling="2026-01-20 18:44:43.992941292 +0000 UTC m=+876.914754336" observedRunningTime="2026-01-20 18:44:44.46411443 +0000 UTC m=+877.385927464" watchObservedRunningTime="2026-01-20 18:44:44.46959172 +0000 UTC m=+877.391404744" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.475590 4773 scope.go:117] "RemoveContainer" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 20 18:44:44 crc kubenswrapper[4773]: E0120 18:44:44.476054 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": container with ID starting with af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e not found: ID does not exist" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.476202 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"} err="failed to get container status \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": rpc error: code = NotFound desc = could not find container \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": container with ID starting with af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e not found: ID does not exist" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.480815 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.484722 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.494012 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") on node \"crc\" DevicePath \"\"" Jan 20 18:44:45 crc kubenswrapper[4773]: I0120 18:44:45.458030 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" path="/var/lib/kubelet/pods/b31a7f2a-24a2-429d-b654-9d87755f5812/volumes" Jan 20 18:44:45 crc kubenswrapper[4773]: I0120 18:44:45.631441 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:46 crc kubenswrapper[4773]: I0120 18:44:46.234370 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:46 crc kubenswrapper[4773]: I0120 18:44:46.269733 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.551304 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.551665 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.575917 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:54 crc kubenswrapper[4773]: I0120 18:44:54.543994 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:56 crc kubenswrapper[4773]: I0120 18:44:56.237609 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.161661 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:00 crc kubenswrapper[4773]: E0120 18:45:00.162214 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162230 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162365 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.165114 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.165213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.170098 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.301963 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.302014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.302037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403005 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403050 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.409040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.419000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.477585 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.881417 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.098079 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.099792 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.104095 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hhr9t" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.143337 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.219879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.220488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.220533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 
18:45:01.322570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.322669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.322720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.323646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.323752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.343395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.415454 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570671 4773 generic.go:334] "Generic (PLEG): container finished" podID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerID="5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a" exitCode=0 Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerDied","Data":"5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a"} Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerStarted","Data":"756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c"} Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.641787 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: W0120 18:45:01.647170 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af94832_1f61_43d7_9c56_bee4b2893499.slice/crio-35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0 WatchSource:0}: Error finding container 35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0: Status 404 returned error can't find the container with id 35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0 Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.581824 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="80a792e8b4d2fb6110c2b6e18c2dfba91dd588be2f16ec01190c2c2dd2b34518" exitCode=0 Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.581944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"80a792e8b4d2fb6110c2b6e18c2dfba91dd588be2f16ec01190c2c2dd2b34518"} Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.582281 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerStarted","Data":"35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0"} Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.816857 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.949920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.950016 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.950113 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.951023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume" (OuterVolumeSpecName: "config-volume") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.955175 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.955859 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g" (OuterVolumeSpecName: "kube-api-access-wzl9g") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "kube-api-access-wzl9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051571 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051837 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051924 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.590505 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="d6c65610baa679afefd44d4caeaf591a2ae640a1333da42d63ebea862cecba29" exitCode=0 Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.590582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"d6c65610baa679afefd44d4caeaf591a2ae640a1333da42d63ebea862cecba29"} Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 
18:45:03.592067 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerDied","Data":"756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c"} Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.592351 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.592137 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:04 crc kubenswrapper[4773]: I0120 18:45:04.600392 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="9eb398dc71f25470e2af328a2029d77747e5efc30044ab80307ded8d3dd0cc4d" exitCode=0 Jan 20 18:45:04 crc kubenswrapper[4773]: I0120 18:45:04.600441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"9eb398dc71f25470e2af328a2029d77747e5efc30044ab80307ded8d3dd0cc4d"} Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.851613 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992373 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992457 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.993345 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle" (OuterVolumeSpecName: "bundle") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.998917 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv" (OuterVolumeSpecName: "kube-api-access-7tkrv") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "kube-api-access-7tkrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.006295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util" (OuterVolumeSpecName: "util") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094140 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094186 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094195 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0"} Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613487 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613488 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.670441 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671498 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671513 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671527 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="pull" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671535 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="pull" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671555 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="util" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671562 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="util" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671579 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671587 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671903 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671917 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.672455 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.675210 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-kcdp2" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.697238 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.700223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g4q\" (UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.801778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77g4q\" (UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.819350 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g4q\" 
(UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.990706 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:14 crc kubenswrapper[4773]: I0120 18:45:14.428507 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:14 crc kubenswrapper[4773]: I0120 18:45:14.668877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" event={"ID":"e2d598ad-b9fa-4874-8669-688e18171e82","Type":"ContainerStarted","Data":"c42e2500d599fdfe3654d8d789dd7a2fedce4d30bedbec7ab2fd053c0a49e225"} Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.721230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" event={"ID":"e2d598ad-b9fa-4874-8669-688e18171e82","Type":"ContainerStarted","Data":"4c1017e21a7625c060953cf413a3f1b3e7cfa23d20bd2908dd0e6748c717ff7c"} Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.721800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.749377 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" podStartSLOduration=2.404513313 podStartE2EDuration="6.749359924s" podCreationTimestamp="2026-01-20 18:45:13 +0000 UTC" firstStartedPulling="2026-01-20 18:45:14.436480666 +0000 
UTC m=+907.358293690" lastFinishedPulling="2026-01-20 18:45:18.781327277 +0000 UTC m=+911.703140301" observedRunningTime="2026-01-20 18:45:19.743210178 +0000 UTC m=+912.665023202" watchObservedRunningTime="2026-01-20 18:45:19.749359924 +0000 UTC m=+912.671172948" Jan 20 18:45:23 crc kubenswrapper[4773]: I0120 18:45:23.993850 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.753786 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.755203 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.767533 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.863879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.863949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.864156 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965267 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.966002 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.987596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.072387 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.308846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772241 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c" exitCode=0 Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"} Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772411 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerStarted","Data":"e2e74be584b696f3c79864fc6b26b1d3bf64e57b5e05f544cf9d23da32d16275"} Jan 20 18:45:28 crc kubenswrapper[4773]: I0120 18:45:28.169801 4773 patch_prober.go:28] 
interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:45:28 crc kubenswrapper[4773]: I0120 18:45:28.170288 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:45:29 crc kubenswrapper[4773]: I0120 18:45:29.795958 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf" exitCode=0 Jan 20 18:45:29 crc kubenswrapper[4773]: I0120 18:45:29.796011 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"} Jan 20 18:45:30 crc kubenswrapper[4773]: I0120 18:45:30.804480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerStarted","Data":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"} Jan 20 18:45:30 crc kubenswrapper[4773]: I0120 18:45:30.821639 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tz7cr" podStartSLOduration=2.15320677 podStartE2EDuration="5.821621924s" podCreationTimestamp="2026-01-20 18:45:25 +0000 UTC" firstStartedPulling="2026-01-20 18:45:26.774547956 +0000 UTC m=+919.696360980" lastFinishedPulling="2026-01-20 18:45:30.4429631 
+0000 UTC m=+923.364776134" observedRunningTime="2026-01-20 18:45:30.81981886 +0000 UTC m=+923.741631904" watchObservedRunningTime="2026-01-20 18:45:30.821621924 +0000 UTC m=+923.743434948" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.074352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.074870 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.111358 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.893598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:38 crc kubenswrapper[4773]: I0120 18:45:38.349030 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:38 crc kubenswrapper[4773]: I0120 18:45:38.860993 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tz7cr" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server" containerID="cri-o://e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" gracePeriod=2 Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.833972 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871329 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" exitCode=0 Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871408 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"} Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"e2e74be584b696f3c79864fc6b26b1d3bf64e57b5e05f544cf9d23da32d16275"} Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871580 4773 scope.go:117] "RemoveContainer" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.898665 4773 scope.go:117] "RemoveContainer" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.923510 4773 scope.go:117] "RemoveContainer" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.942844 4773 scope.go:117] "RemoveContainer" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.943748 4773 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": container with ID starting with e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9 not found: ID does not exist" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.943787 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"} err="failed to get container status \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": rpc error: code = NotFound desc = could not find container \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": container with ID starting with e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9 not found: ID does not exist" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.943814 4773 scope.go:117] "RemoveContainer" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf" Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.944163 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": container with ID starting with dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf not found: ID does not exist" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944192 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"} err="failed to get container status \"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": rpc error: code = NotFound desc = could not find container 
\"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": container with ID starting with dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf not found: ID does not exist" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944210 4773 scope.go:117] "RemoveContainer" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c" Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.944513 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": container with ID starting with 4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c not found: ID does not exist" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944542 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"} err="failed to get container status \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": rpc error: code = NotFound desc = could not find container \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": container with ID starting with 4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c not found: ID does not exist" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966024 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzb4\" (UniqueName: 
\"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities" (OuterVolumeSpecName: "utilities") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.973429 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4" (OuterVolumeSpecName: "kube-api-access-njzb4") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "kube-api-access-njzb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.025250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068576 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068622 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068637 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.210828 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.216195 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:41 crc kubenswrapper[4773]: I0120 18:45:41.456524 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" path="/var/lib/kubelet/pods/0bf6917d-4c23-4e7c-8969-822309492cfb/volumes" Jan 20 18:45:58 crc kubenswrapper[4773]: I0120 18:45:58.170304 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:45:58 crc kubenswrapper[4773]: I0120 18:45:58.170903 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700165 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"] Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700808 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700829 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server" Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700861 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-content" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700871 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-content" Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700897 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-utilities" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700908 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-utilities" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.701101 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.701601 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.703396 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-45r8r" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.706894 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.707995 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.711551 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2j2hv" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.719474 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.724651 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.733196 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.734288 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.737331 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5khm4" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.749628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 
18:46:01.768125 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.768850 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.771708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-r8n8d" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.781738 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.782827 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.786783 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qn28m" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.796299 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.799304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.819619 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.830479 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.839524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nx6wx" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.862011 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.873972 4773 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.874775 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.890783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892674 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hr4gt" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892839 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892971 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.894891 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.906651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.908373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.913622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.917877 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.923920 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zb5cc" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.924713 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gvm5n" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: \"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958850 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958897 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.975026 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.979297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.987270 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.989622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"] Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.990755 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.994981 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.009399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-j66lb" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.022750 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.037460 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.052239 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.052740 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.053998 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: \"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.061862 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.061912 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:02.56189654 +0000 UTC m=+955.483709564 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.062025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.062114 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.068271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dbxds" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.083112 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.084060 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.085527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.089779 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-plc9j" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.095725 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.097555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.100118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: 
\"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.109125 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.118009 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.126014 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.126098 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.148202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.159879 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.169497 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.169603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.192111 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.195250 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.203130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-h7nkq" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.207068 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.208213 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.229846 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-85qwb" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.244596 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.248495 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271764 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52zcv\" (UniqueName: \"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: 
\"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.293049 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.308542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.315002 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.329997 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.331420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.334958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.341336 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tspl4" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.356527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.357475 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.364747 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t49mz" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.364938 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52zcv\" (UniqueName: \"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: \"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.382379 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.383858 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.390714 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-25cgj" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.398654 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.400887 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.406889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.425089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.434090 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.434540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52zcv\" (UniqueName: 
\"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: \"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.435910 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.436764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.443362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.446658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.462206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.463528 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.466864 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hx58c" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.476475 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hd9xt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccx9m\" (UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 
18:46:02.482241 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482256 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod \"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.493447 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.494301 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.499591 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.500311 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.505357 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5qzhx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.530278 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.546460 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583827 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583919 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccx9m\" 
(UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod 
\"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.584923 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.584981 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.084968149 +0000 UTC m=+956.006781173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.585119 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.585148 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.585140244 +0000 UTC m=+956.506953268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.642719 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.652265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.657594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod \"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.663578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.665580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.672617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccx9m\" (UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.684735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.685721 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.706861 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.739352 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.786793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.789846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.791106 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.804410 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rhtgx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.814400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.832741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.848164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.893189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.902483 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.949568 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.950524 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.953393 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lx94s" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.954285 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.954406 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.958953 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.960075 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.961560 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.969337 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.970701 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.972619 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2wj4c" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.973628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.994987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995072 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995094 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995148 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995255 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995827 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.027769 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.029137 
4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097532 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.097990 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098035 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.098020962 +0000 UTC m=+957.019833986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098308 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098339 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.598329399 +0000 UTC m=+956.520142423 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098524 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098551 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.598543594 +0000 UTC m=+956.520356618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.128887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.136631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.158634 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.168163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.198058 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.213735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.385392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.407000 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.419609 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac02d392_7ff9_42e1_ad6f_47ab9f04a9a7.slice/crio-46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413 WatchSource:0}: Error finding container 46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413: Status 404 returned error can't find the container with id 46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413 Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.601649 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608897 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.608879492 +0000 UTC m=+957.530692516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.609100 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.608980665 +0000 UTC m=+957.530793689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.609126 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:05.609118368 +0000 UTC m=+958.530931392 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.618847 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.619765 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5406b5_d194_441a_a098_7ecdc7831ec1.slice/crio-bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a WatchSource:0}: Error finding container bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a: Status 404 returned error can't find the container with id bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.636363 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.643588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.655846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.745632 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.752041 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.761399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.765996 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff53e5c0_255a_43c5_a27c_ce9dc3145999.slice/crio-c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d WatchSource:0}: Error finding container c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d: Status 404 returned error can't find the container with id c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.773779 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.780100 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b3e0e3_f4c7_4b3d_9ba0_a198be108cb3.slice/crio-104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415 WatchSource:0}: Error finding container 104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415: Status 404 returned error can't find the container with id 104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415 Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.782386 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2601732b_921a_4c55_821b_0fc994c50236.slice/crio-4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365 WatchSource:0}: Error finding container 4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365: Status 404 returned error 
can't find the container with id 4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365 Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.782731 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfba823f_e85e_42ae_aa8a_7926cc906b92.slice/crio-ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07 WatchSource:0}: Error finding container ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07: Status 404 returned error can't find the container with id ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07 Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.785082 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xxc76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-26j8t_openstack-operators(2601732b-921a-4c55-821b-0fc994c50236): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.785176 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwhn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7f4549b895-p2vwt_openstack-operators(cfba823f-e85e-42ae-aa8a-7926cc906b92): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.787023 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.787025 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.015770 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.023607 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.024006 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" event={"ID":"4604c39e-62d8-4420-b2bc-54d44f4ebcd0","Type":"ContainerStarted","Data":"41b43fade328e74bc89e73feac7662785f6bfd1c4dba192806242b2b446e4d11"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.028480 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.033044 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.036278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" event={"ID":"fb5406b5-d194-441a-a098-7ecdc7831ec1","Type":"ContainerStarted","Data":"bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.038145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" event={"ID":"ff53e5c0-255a-43c5-a27c-ce9dc3145999","Type":"ContainerStarted","Data":"c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.039383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" event={"ID":"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7","Type":"ContainerStarted","Data":"46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.040950 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.041135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" event={"ID":"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3","Type":"ContainerStarted","Data":"104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.046057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" event={"ID":"951d4f5c-5d89-41c6-be8a-9828b05ce182","Type":"ContainerStarted","Data":"278725e5154191a674c911b99a316ab6ebdb507f6308f26d7ea0c5b2ad857d67"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.047224 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" event={"ID":"2601732b-921a-4c55-821b-0fc994c50236","Type":"ContainerStarted","Data":"4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.048800 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.049531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" event={"ID":"df2d6d5b-b964-4672-903f-563b7792ee43","Type":"ContainerStarted","Data":"95982d054ce701f7f8592be8a6e0c47177f54e1fc0fab835aa6019e69a82e725"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 
18:46:04.050957 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" event={"ID":"b773ecb8-3505-44ad-a28f-bd4054263888","Type":"ContainerStarted","Data":"dad345a1957fa5908f97e40e2038b2fbcc9e75759c47d727d7b071647d056eee"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.060639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" event={"ID":"cfba823f-e85e-42ae-aa8a-7926cc906b92","Type":"ContainerStarted","Data":"ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.062964 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" event={"ID":"d1051db2-8914-422b-a126-5cd8ee078767","Type":"ContainerStarted","Data":"3509bb73a7c494730ebd45c76ce96c5cb00819d68e87a248d553b4ffa82d261d"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.063231 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1\\\"\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:04 crc kubenswrapper[4773]: W0120 18:46:04.065956 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e235ee6_33ad_40e3_9b7a_914820315627.slice/crio-3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9 WatchSource:0}: Error finding container 3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9: Status 404 returned error can't find the container with id 3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9 Jan 20 
18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.075001 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lcl7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-nhqxg_openstack-operators(7f740208-043d-4d7f-b533-5526833d10c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.076222 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:04 crc kubenswrapper[4773]: W0120 18:46:04.076536 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99558a40_3dbc_4c2b_9aab_a085c7ef5c7c.slice/crio-6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1 WatchSource:0}: Error finding container 6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1: Status 404 returned error can't find the container with id 6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1 Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.078236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" event={"ID":"a570d5a5-53f4-444f-a14d-92ea24f27e2e","Type":"ContainerStarted","Data":"7f64273351d7647a4d68a3135f9b84cd07b269a4bf2a98ff08b8aceabdb27798"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.083326 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26sx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-r5qgh_openstack-operators(99558a40-3dbc-4c2b-9aab-a085c7ef5c7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.083502 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5bxh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-t8tmg_openstack-operators(9e235ee6-33ad-40e3-9b7a-914820315627): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.092634 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:04 crc 
kubenswrapper[4773]: E0120 18:46:04.092789 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.096892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" event={"ID":"8f795216-0196-4a5a-bfdf-20dee1543b43","Type":"ContainerStarted","Data":"14c16a8d957ce49a27a61756659da008941d74137df88b9548b27ef4c11d5fc5"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.099129 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" event={"ID":"b196e443-f058-49c2-b54b-a18656415f5a","Type":"ContainerStarted","Data":"25121ab30e93fb5f1c8efdf266ec2c9d6784f7d622e1b4727383e3f887f544e6"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.102061 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" event={"ID":"48aacb32-c120-4f36-898b-60f5d01c5510","Type":"ContainerStarted","Data":"d29312ff4c8c6d40a8552f80ee25b874460ee3d1a8f31baf3a1233e9c9197fef"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.124624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.124741 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.124804 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.124784272 +0000 UTC m=+959.046597296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.630878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.631124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631282 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631347 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.63132926 +0000 UTC m=+959.553142284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631352 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631427 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.631409932 +0000 UTC m=+959.553222956 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.115692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" event={"ID":"9e235ee6-33ad-40e3-9b7a-914820315627","Type":"ContainerStarted","Data":"3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.120772 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.127103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" event={"ID":"7ed73202-faba-46ba-ae91-8cd9ffbe70a4","Type":"ContainerStarted","Data":"55ab43b01eb56ce4f10f20a1d0e5fd00bdbc0616311b8ee1b864cdbd5b1a70ce"} Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.128389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" event={"ID":"ed6d3389-b374-42a6-8101-1d34df737170","Type":"ContainerStarted","Data":"babb387495ae3fc73c7f583c1ad3f91c9a44dbbe25711303c9c4b55d12e8c204"} Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.130679 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" event={"ID":"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c","Type":"ContainerStarted","Data":"6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.134460 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.139800 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" event={"ID":"7f740208-043d-4d7f-b533-5526833d10c2","Type":"ContainerStarted","Data":"ff84ca1c2d3d9a4637ef8c35d6498288cfbb0db6a6e92e7d20a80a7ac3039598"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.141163 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.142211 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" 
podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.155945 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1\\\"\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.647946 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.649112 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.649168 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:09.649151988 +0000 UTC m=+962.570965022 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.149823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.151367 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.151534 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.155604 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.155718 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.156134 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.156118345 +0000 UTC m=+963.077931389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.673237 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.673616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " 
pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674189 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674331 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.674313109 +0000 UTC m=+963.596126133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674427 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674496 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.674481583 +0000 UTC m=+963.596294607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:09 crc kubenswrapper[4773]: I0120 18:46:09.721440 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:09 crc kubenswrapper[4773]: E0120 18:46:09.721637 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:09 crc kubenswrapper[4773]: E0120 18:46:09.722123 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:17.722100409 +0000 UTC m=+970.643913433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.228364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.228516 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.228564 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.228549604 +0000 UTC m=+971.150362628 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.736214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.736349 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736387 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736464 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.736443793 +0000 UTC m=+971.658256867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736477 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736521 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.736508195 +0000 UTC m=+971.658321219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.738859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.746112 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:17 crc 
kubenswrapper[4773]: I0120 18:46:17.875415 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hr4gt" Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.883991 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.246623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.253143 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.288718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t49mz" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.297599 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.753573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.754649 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.760093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.760662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:19 crc kubenswrapper[4773]: I0120 18:46:19.053966 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lx94s" Jan 20 18:46:19 crc kubenswrapper[4773]: I0120 18:46:19.062986 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.812423 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.812912 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ccx9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-2thqw_openstack-operators(7ed73202-faba-46ba-ae91-8cd9ffbe70a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.814127 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podUID="7ed73202-faba-46ba-ae91-8cd9ffbe70a4" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.267198 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podUID="7ed73202-faba-46ba-ae91-8cd9ffbe70a4" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.310798 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.311047 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cm7vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-s7scg_openstack-operators(8f795216-0196-4a5a-bfdf-20dee1543b43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.312688 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podUID="8f795216-0196-4a5a-bfdf-20dee1543b43" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.997900 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.998094 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mf6fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-9b68f5989-xmljc_openstack-operators(48aacb32-c120-4f36-898b-60f5d01c5510): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.999239 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podUID="48aacb32-c120-4f36-898b-60f5d01c5510" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.273867 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podUID="8f795216-0196-4a5a-bfdf-20dee1543b43" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.273922 4773 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podUID="48aacb32-c120-4f36-898b-60f5d01c5510" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.616671 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.616869 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k7cd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-2nhdr_openstack-operators(951d4f5c-5d89-41c6-be8a-9828b05ce182): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.618265 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podUID="951d4f5c-5d89-41c6-be8a-9828b05ce182" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.277922 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podUID="951d4f5c-5d89-41c6-be8a-9828b05ce182" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.289213 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.289369 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bqbnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7ddb5c749-hhxlp_openstack-operators(df2d6d5b-b964-4672-903f-563b7792ee43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.290522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podUID="df2d6d5b-b964-4672-903f-563b7792ee43" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.882165 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.882318 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbzzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-8tsjs_openstack-operators(b773ecb8-3505-44ad-a28f-bd4054263888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.884056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podUID="b773ecb8-3505-44ad-a28f-bd4054263888" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.283344 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podUID="df2d6d5b-b964-4672-903f-563b7792ee43" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.283390 4773 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podUID="b773ecb8-3505-44ad-a28f-bd4054263888" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.725496 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.725735 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-prhbl_openstack-operators(fb5406b5-d194-441a-a098-7ecdc7831ec1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.727003 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podUID="fb5406b5-d194-441a-a098-7ecdc7831ec1" Jan 20 18:46:26 crc kubenswrapper[4773]: E0120 18:46:26.289985 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podUID="fb5406b5-d194-441a-a098-7ecdc7831ec1" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171041 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171111 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171156 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.297946 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.298018 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" 
containerID="cri-o://89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57" gracePeriod=600 Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310087 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57" exitCode=0 Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310476 4773 scope.go:117] "RemoveContainer" containerID="714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.079047 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"] Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.088368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.190159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:30 crc kubenswrapper[4773]: W0120 18:46:30.224869 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d68359_5910_4d1d_8a01_2964f8d26464.slice/crio-548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a WatchSource:0}: Error finding container 548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a: Status 404 returned error can't find the container with id 
548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.324484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" event={"ID":"e9f6d4b3-c2cc-4cc6-b279-362e7439974b","Type":"ContainerStarted","Data":"d56e8b5b597fef7841cadf1d13a05a4c67e6fed5e6ed84c9a6df6b41f14f9026"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.329922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" event={"ID":"ed6d3389-b374-42a6-8101-1d34df737170","Type":"ContainerStarted","Data":"225ab666fcfe64aff32960e04fa9499c060c2a0be739be8c6d0c67a558ef1133"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.330174 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.334022 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" event={"ID":"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7","Type":"ContainerStarted","Data":"f6f41745989aefdfb6be4b9256e04a5decc4695f79328abd6ac55c693ed8a6ae"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.334106 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.336415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" event={"ID":"4604c39e-62d8-4420-b2bc-54d44f4ebcd0","Type":"ContainerStarted","Data":"5be1ce2b89dd09127c727aace2e25dc991ca2928376f4a720ec9b132776ab527"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.336524 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.337882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" event={"ID":"437cadd4-5809-4b9e-afa2-05832cd6c303","Type":"ContainerStarted","Data":"a2783b95c2c86a5c744bdbb311ab64640765bd836f494b66f358c11cf764eaba"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.342756 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" event={"ID":"86d68359-5910-4d1d-8a01-2964f8d26464","Type":"ContainerStarted","Data":"548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.346429 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" event={"ID":"d1051db2-8914-422b-a126-5cd8ee078767","Type":"ContainerStarted","Data":"5fecf636083a07b12876b80526124ebde6af250ad5c582b13271ec117ae015f2"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.346575 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.347893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" event={"ID":"b196e443-f058-49c2-b54b-a18656415f5a","Type":"ContainerStarted","Data":"0b620f20649ee67c6a6e1be9ee57511052c83baef2aab5eed463c75243ed6480"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.348056 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.348994 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" event={"ID":"a570d5a5-53f4-444f-a14d-92ea24f27e2e","Type":"ContainerStarted","Data":"1da4e4f867251675e510db96e5840f214b1723ced50a5ab49b04febaaca85c8a"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.349115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.352117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" event={"ID":"ff53e5c0-255a-43c5-a27c-ce9dc3145999","Type":"ContainerStarted","Data":"6080865f8e2aa467bd0c9359aa69b2a87efb198c309601fcd851d09717e11418"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.357893 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" podStartSLOduration=6.719818994 podStartE2EDuration="29.357873754s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.063763855 +0000 UTC m=+956.985576929" lastFinishedPulling="2026-01-20 18:46:26.701818665 +0000 UTC m=+979.623631689" observedRunningTime="2026-01-20 18:46:30.354871532 +0000 UTC m=+983.276684556" watchObservedRunningTime="2026-01-20 18:46:30.357873754 +0000 UTC m=+983.279686778" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.416257 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" podStartSLOduration=6.9836091190000005 podStartE2EDuration="29.416240741s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.276008415 +0000 UTC m=+956.197821439" lastFinishedPulling="2026-01-20 18:46:25.708640037 +0000 UTC m=+978.630453061" 
observedRunningTime="2026-01-20 18:46:30.382667902 +0000 UTC m=+983.304480926" watchObservedRunningTime="2026-01-20 18:46:30.416240741 +0000 UTC m=+983.338053755" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.489076 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" podStartSLOduration=7.399644886 podStartE2EDuration="29.489054336s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.619217157 +0000 UTC m=+956.541030181" lastFinishedPulling="2026-01-20 18:46:25.708626607 +0000 UTC m=+978.630439631" observedRunningTime="2026-01-20 18:46:30.487480139 +0000 UTC m=+983.409293163" watchObservedRunningTime="2026-01-20 18:46:30.489054336 +0000 UTC m=+983.410867360" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.493032 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" podStartSLOduration=6.216964895 podStartE2EDuration="29.493019932s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.424863976 +0000 UTC m=+956.346677000" lastFinishedPulling="2026-01-20 18:46:26.700919023 +0000 UTC m=+979.622732037" observedRunningTime="2026-01-20 18:46:30.415841242 +0000 UTC m=+983.337654266" watchObservedRunningTime="2026-01-20 18:46:30.493019932 +0000 UTC m=+983.414832956" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.521549 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" podStartSLOduration=7.672889035 podStartE2EDuration="29.521529129s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.056131328 +0000 UTC m=+955.977944352" lastFinishedPulling="2026-01-20 18:46:24.904771422 +0000 UTC m=+977.826584446" 
observedRunningTime="2026-01-20 18:46:30.518042675 +0000 UTC m=+983.439855729" watchObservedRunningTime="2026-01-20 18:46:30.521529129 +0000 UTC m=+983.443342153" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.556981 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" podStartSLOduration=6.988478684 podStartE2EDuration="29.556926722s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.652499487 +0000 UTC m=+956.574312511" lastFinishedPulling="2026-01-20 18:46:26.220947525 +0000 UTC m=+979.142760549" observedRunningTime="2026-01-20 18:46:30.552450244 +0000 UTC m=+983.474263278" watchObservedRunningTime="2026-01-20 18:46:30.556926722 +0000 UTC m=+983.478739756" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.580525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" podStartSLOduration=6.656496206 podStartE2EDuration="29.5804974s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.77696193 +0000 UTC m=+956.698774954" lastFinishedPulling="2026-01-20 18:46:26.700963124 +0000 UTC m=+979.622776148" observedRunningTime="2026-01-20 18:46:30.575790447 +0000 UTC m=+983.497603481" watchObservedRunningTime="2026-01-20 18:46:30.5804974 +0000 UTC m=+983.502310424" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.360999 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" event={"ID":"cfba823f-e85e-42ae-aa8a-7926cc906b92","Type":"ContainerStarted","Data":"bc13a0347bc52acd4845ee6cc4777bc12ce0bb7f8eadf3b1a0a2a82d784dc290"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.361613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.363538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" event={"ID":"86d68359-5910-4d1d-8a01-2964f8d26464","Type":"ContainerStarted","Data":"1deca38d6d72d40be14a569795e68e007d3dcdda4e78313d9283207d77f70799"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.363595 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.365325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" event={"ID":"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3","Type":"ContainerStarted","Data":"8334e1984196bd8a090094cad622f282087ee531b33d8876d7af07043fffa4de"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.365759 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.367762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" event={"ID":"2601732b-921a-4c55-821b-0fc994c50236","Type":"ContainerStarted","Data":"375aba268785c6d71dcab62d1c74b08c423f474ddab1b9a1cb4981db15c5b193"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.367984 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.370208 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" 
event={"ID":"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c","Type":"ContainerStarted","Data":"0edd9a3f79b3873858cda3701083f1570e467a94838bca4e64e982516bad63ba"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.371879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" event={"ID":"7f740208-043d-4d7f-b533-5526833d10c2","Type":"ContainerStarted","Data":"b7f8b9cf6e95c1003d0b950cf31f767ec177bad7e0c534fd0918ac61fd6d681b"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.372065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.374558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.380506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" event={"ID":"9e235ee6-33ad-40e3-9b7a-914820315627","Type":"ContainerStarted","Data":"82421a1bdb62a10ce5637d5aa7bee9a1985a6689ac33ef12ae9a27483c3ad306"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.381678 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.383013 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podStartSLOduration=3.4177183429999998 podStartE2EDuration="29.382992932s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.785078733 
+0000 UTC m=+956.706891757" lastFinishedPulling="2026-01-20 18:46:29.750353322 +0000 UTC m=+982.672166346" observedRunningTime="2026-01-20 18:46:31.37790796 +0000 UTC m=+984.299720974" watchObservedRunningTime="2026-01-20 18:46:31.382992932 +0000 UTC m=+984.304805976" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.402883 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podStartSLOduration=3.548883316 podStartE2EDuration="29.402860032s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.784904469 +0000 UTC m=+956.706717493" lastFinishedPulling="2026-01-20 18:46:29.638881185 +0000 UTC m=+982.560694209" observedRunningTime="2026-01-20 18:46:31.398514387 +0000 UTC m=+984.320327411" watchObservedRunningTime="2026-01-20 18:46:31.402860032 +0000 UTC m=+984.324673066" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.437593 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" podStartSLOduration=6.998713064 podStartE2EDuration="29.437570467s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.782128233 +0000 UTC m=+956.703941257" lastFinishedPulling="2026-01-20 18:46:26.220985636 +0000 UTC m=+979.142798660" observedRunningTime="2026-01-20 18:46:31.435267653 +0000 UTC m=+984.357080687" watchObservedRunningTime="2026-01-20 18:46:31.437570467 +0000 UTC m=+984.359383491" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.454753 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podStartSLOduration=3.786520021 podStartE2EDuration="29.454737702s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.083188606 +0000 UTC m=+957.005001640" 
lastFinishedPulling="2026-01-20 18:46:29.751406297 +0000 UTC m=+982.673219321" observedRunningTime="2026-01-20 18:46:31.449594748 +0000 UTC m=+984.371407772" watchObservedRunningTime="2026-01-20 18:46:31.454737702 +0000 UTC m=+984.376550726" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.502525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podStartSLOduration=3.944802189 podStartE2EDuration="29.502504573s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.074806747 +0000 UTC m=+956.996619771" lastFinishedPulling="2026-01-20 18:46:29.632509131 +0000 UTC m=+982.554322155" observedRunningTime="2026-01-20 18:46:31.502204796 +0000 UTC m=+984.424017820" watchObservedRunningTime="2026-01-20 18:46:31.502504573 +0000 UTC m=+984.424317597" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.531846 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" podStartSLOduration=29.5318283 podStartE2EDuration="29.5318283s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:46:31.530344704 +0000 UTC m=+984.452157748" watchObservedRunningTime="2026-01-20 18:46:31.5318283 +0000 UTC m=+984.453641324" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.569920 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podStartSLOduration=4.021214916 podStartE2EDuration="29.569900967s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.083222396 +0000 UTC m=+957.005035420" lastFinishedPulling="2026-01-20 18:46:29.631908447 +0000 UTC m=+982.553721471" 
observedRunningTime="2026-01-20 18:46:31.55587837 +0000 UTC m=+984.477691394" watchObservedRunningTime="2026-01-20 18:46:31.569900967 +0000 UTC m=+984.491713991" Jan 20 18:46:32 crc kubenswrapper[4773]: I0120 18:46:32.849109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.400509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" event={"ID":"437cadd4-5809-4b9e-afa2-05832cd6c303","Type":"ContainerStarted","Data":"6fa4b3f2f0eb9922713b5f5a37d8ae0ce20978ec654c02623263a1dc454e69e8"} Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.401167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.402200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" event={"ID":"e9f6d4b3-c2cc-4cc6-b279-362e7439974b","Type":"ContainerStarted","Data":"864a35a9d532a1975371bfde3eaffbfa11f32981c9282da45666262860d4a236"} Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.402346 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.419841 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" podStartSLOduration=29.844856372 podStartE2EDuration="33.419823067s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:30.098326919 +0000 UTC m=+983.020139943" lastFinishedPulling="2026-01-20 18:46:33.673293614 +0000 UTC 
m=+986.595106638" observedRunningTime="2026-01-20 18:46:34.413800252 +0000 UTC m=+987.335613276" watchObservedRunningTime="2026-01-20 18:46:34.419823067 +0000 UTC m=+987.341636091" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.439798 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" podStartSLOduration=28.864817715 podStartE2EDuration="32.439776299s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:30.091470784 +0000 UTC m=+983.013283808" lastFinishedPulling="2026-01-20 18:46:33.666429368 +0000 UTC m=+986.588242392" observedRunningTime="2026-01-20 18:46:34.435060794 +0000 UTC m=+987.356873828" watchObservedRunningTime="2026-01-20 18:46:34.439776299 +0000 UTC m=+987.361589333" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.415732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" event={"ID":"951d4f5c-5d89-41c6-be8a-9828b05ce182","Type":"ContainerStarted","Data":"03e2dc473dc5f09fb2c707e445239f85f4ce48c3b1ee5013945835a1093f07b4"} Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.416514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.417084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" event={"ID":"48aacb32-c120-4f36-898b-60f5d01c5510","Type":"ContainerStarted","Data":"c4bdc5bc988df23d572ffe2a4c4fdc33391e7fcea5cde31dbfeb47452fb5d145"} Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.417230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:36 crc 
kubenswrapper[4773]: I0120 18:46:36.435088 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podStartSLOduration=2.973644444 podStartE2EDuration="35.43507479s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.396559395 +0000 UTC m=+956.318372419" lastFinishedPulling="2026-01-20 18:46:35.857989741 +0000 UTC m=+988.779802765" observedRunningTime="2026-01-20 18:46:36.432499108 +0000 UTC m=+989.354312132" watchObservedRunningTime="2026-01-20 18:46:36.43507479 +0000 UTC m=+989.356887814" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.450839 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podStartSLOduration=2.628659456 podStartE2EDuration="35.450818699s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.085853393 +0000 UTC m=+956.007666417" lastFinishedPulling="2026-01-20 18:46:35.908012636 +0000 UTC m=+988.829825660" observedRunningTime="2026-01-20 18:46:36.447741995 +0000 UTC m=+989.369555039" watchObservedRunningTime="2026-01-20 18:46:36.450818699 +0000 UTC m=+989.372631733" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.305041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.429127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" event={"ID":"8f795216-0196-4a5a-bfdf-20dee1543b43","Type":"ContainerStarted","Data":"c6084c26d58eaddd5ce92798e6ac9ed4fbe002dead88e2e8f0194fe16a248954"} Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.429281 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.431144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" event={"ID":"7ed73202-faba-46ba-ae91-8cd9ffbe70a4","Type":"ContainerStarted","Data":"ad668968f8e3a57e062edbde61a9e295d5400b9f8de38d07f993ea4f0306400b"} Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.431359 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.451555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podStartSLOduration=3.041617408 podStartE2EDuration="37.451539362s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.64922937 +0000 UTC m=+956.571042394" lastFinishedPulling="2026-01-20 18:46:38.059151324 +0000 UTC m=+990.980964348" observedRunningTime="2026-01-20 18:46:38.445776463 +0000 UTC m=+991.367589507" watchObservedRunningTime="2026-01-20 18:46:38.451539362 +0000 UTC m=+991.373352386" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.462885 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podStartSLOduration=2.550823273 podStartE2EDuration="36.462870895s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.041761912 +0000 UTC m=+956.963574936" lastFinishedPulling="2026-01-20 18:46:37.953809534 +0000 UTC m=+990.875622558" observedRunningTime="2026-01-20 18:46:38.461812879 +0000 UTC m=+991.383625893" watchObservedRunningTime="2026-01-20 18:46:38.462870895 +0000 UTC m=+991.384683909" Jan 20 18:46:39 crc kubenswrapper[4773]: 
I0120 18:46:39.069123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.441611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" event={"ID":"b773ecb8-3505-44ad-a28f-bd4054263888","Type":"ContainerStarted","Data":"70d604bf9ca953013dee36c6a8d67d0164744056953dcfcc1b8e7a8abb489b53"} Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.442881 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.474551 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podStartSLOduration=3.206588806 podStartE2EDuration="38.474535009s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.619095765 +0000 UTC m=+956.540908789" lastFinishedPulling="2026-01-20 18:46:38.887041968 +0000 UTC m=+991.808854992" observedRunningTime="2026-01-20 18:46:39.473199306 +0000 UTC m=+992.395012330" watchObservedRunningTime="2026-01-20 18:46:39.474535009 +0000 UTC m=+992.396348033" Jan 20 18:46:41 crc kubenswrapper[4773]: I0120 18:46:41.461018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" event={"ID":"df2d6d5b-b964-4672-903f-563b7792ee43","Type":"ContainerStarted","Data":"a3a3434b0caf975b7fcb78687b73fe7fee584419559774a3ebb782d782923b81"} Jan 20 18:46:41 crc kubenswrapper[4773]: I0120 18:46:41.462077 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:41 crc kubenswrapper[4773]: 
I0120 18:46:41.487781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podStartSLOduration=2.799116823 podStartE2EDuration="40.487759402s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.221997483 +0000 UTC m=+956.143810507" lastFinishedPulling="2026-01-20 18:46:40.910640062 +0000 UTC m=+993.832453086" observedRunningTime="2026-01-20 18:46:41.484498823 +0000 UTC m=+994.406311847" watchObservedRunningTime="2026-01-20 18:46:41.487759402 +0000 UTC m=+994.409572426" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.026395 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.056460 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.122790 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.171109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.297109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.403598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.463830 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" event={"ID":"fb5406b5-d194-441a-a098-7ecdc7831ec1","Type":"ContainerStarted","Data":"84f6d7fcfe0f1cb518d89aae5d096c9c0c34986193c2d2763244de1d4b1b9b16"} Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.464068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.481913 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podStartSLOduration=3.202728939 podStartE2EDuration="41.481896833s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.626724116 +0000 UTC m=+956.548537140" lastFinishedPulling="2026-01-20 18:46:41.90589201 +0000 UTC m=+994.827705034" observedRunningTime="2026-01-20 18:46:42.479898985 +0000 UTC m=+995.401712009" watchObservedRunningTime="2026-01-20 18:46:42.481896833 +0000 UTC m=+995.403709857" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.502956 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.533658 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.647111 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.711439 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:42 crc 
kubenswrapper[4773]: I0120 18:46:42.852634 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.962739 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.964037 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:43 crc kubenswrapper[4773]: I0120 18:46:43.161297 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:47 crc kubenswrapper[4773]: I0120 18:46:47.889906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.040067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.318526 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.503722 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.550114 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.906787 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.610111 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.639198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.639657 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.645982 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646388 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646721 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646862 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qw675" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.747410 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.748500 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.754258 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.756688 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.756807 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.769800 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857700 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc 
kubenswrapper[4773]: I0120 18:47:07.857770 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.858167 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.859056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.882620 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 
18:47:07.958967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.959338 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.959446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.960430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.960460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.985258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rt2\" 
(UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.010193 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.072099 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.451511 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.463525 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.530393 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:08 crc kubenswrapper[4773]: W0120 18:47:08.534395 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfd387a_7b46_44b7_8aed_53e919c99903.slice/crio-b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49 WatchSource:0}: Error finding container b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49: Status 404 returned error can't find the container with id b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49 Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.667495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" event={"ID":"2dfd387a-7b46-44b7-8aed-53e919c99903","Type":"ContainerStarted","Data":"b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49"} Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 
18:47:08.668866 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" event={"ID":"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4","Type":"ContainerStarted","Data":"6668d8f915cf882b03ecb7f9c9321d7df5e2547abe73d4876b6b1e6b3be6ef5b"} Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.424200 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.452846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.454010 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.473253 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.596715 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.596785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.597046 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: 
\"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.699603 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.699781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod 
\"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.722218 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.728319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.776983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.777392 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.778554 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.793496 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: 
\"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.004523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.005102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.023468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.096512 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.348978 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.355191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:11 crc kubenswrapper[4773]: W0120 18:47:11.357108 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00a537a_172f_4ec7_9573_dd9ac2f347e3.slice/crio-c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf WatchSource:0}: Error finding container c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf: Status 404 returned error can't find the container with id c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf Jan 20 18:47:11 crc kubenswrapper[4773]: W0120 18:47:11.360454 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a48624_e6b5_4225_baf3_c05ff3bed80d.slice/crio-a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268 WatchSource:0}: Error finding container a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268: Status 404 returned error can't find the container with id a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268 Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.690781 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" event={"ID":"22a48624-e6b5-4225-baf3-c05ff3bed80d","Type":"ContainerStarted","Data":"a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268"} Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.692791 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" 
event={"ID":"a00a537a-172f-4ec7-9573-dd9ac2f347e3","Type":"ContainerStarted","Data":"c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf"} Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.282920 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.284579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.287365 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.287491 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.288764 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.289884 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.293328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pbqbk" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.295606 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.295796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.296245 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.303787 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307574 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307659 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6z6h4" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307588 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307986 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.308120 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307964 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.320438 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.330026 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.427899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 
18:47:12.427979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428026 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428322 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428399 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428678 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428806 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429105 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429175 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429491 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429796 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532138 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532477 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533645 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 
18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533663 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533680 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533708 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533813 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533862 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534052 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534351 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534780 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.535638 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.535766 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536217 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536841 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537302 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537367 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.543048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.543583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.551057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.552210 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.552852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.556676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.558147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.566294 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.579656 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.583135 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.593324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.614721 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.626073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.988392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.054560 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.056466 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.061890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.062923 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lh9jh" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.065604 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.065782 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.070365 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.072061 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144333 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144699 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144721 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144782 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.248189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.248204 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") device 
mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.249893 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.250198 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.250918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.252108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.257302 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.258361 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.272132 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.287346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.334412 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.376385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.709716 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"81b5a2b92f1105f5c420453ca19111fe1ca35ac9507a3ac978f1c848d16b5b05"} Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.717863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"9437201a24daa22de36ef5e4cb32d33d9216523028488aa287392d8e49c9e78c"} Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.004655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: W0120 18:47:14.098350 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b243ca_6da3_4247_a1fe_2ea3e5be80cc.slice/crio-e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f WatchSource:0}: Error finding container e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f: Status 404 returned error can't find the container with id e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.533171 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.534895 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.536977 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-88msx" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.537741 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.537968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.540988 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.558183 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683698 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684030 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.748534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f"} Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.785535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.785585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.786678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.786809 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") 
" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc 
kubenswrapper[4773]: I0120 18:47:14.788212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788549 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.789296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.797830 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.805306 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.808863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.823757 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.867496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.953396 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.954682 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.956593 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dzq72" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.957360 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.957502 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.969672 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100287 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8mf\" (UniqueName: \"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202830 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8mf\" (UniqueName: \"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.204154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.204706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.209723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.210498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.228886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8mf\" (UniqueName: \"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.300752 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.571748 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.605391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.766586 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"0ee5fe3b9e18aaafc48facf4791c76cfe84529f11deecc1670a22baf6625aa11"}
Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.769227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4","Type":"ContainerStarted","Data":"ad3a696e356892a03596bf88ca7536ddf3794e1fc34c0c01572bf4ffbb7eeda6"}
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.618247 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.619153 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.621948 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zrhgh"
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.639192 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.741483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0"
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.842534 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0"
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.867550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0"
Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.937042 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 20 18:47:17 crc kubenswrapper[4773]: I0120 18:47:17.537197 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 20 18:47:17 crc kubenswrapper[4773]: I0120 18:47:17.794860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerStarted","Data":"381de174237c80b24a95594eb30259e92e84f6fa102ffa5688eefcf07e0ea711"}
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.066195 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.068614 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.070910 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071348 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071587 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071917 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.074942 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.076218 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7pgtj"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.251595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252242 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252266 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.353967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354647 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355387 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355793 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.362158 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.363247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.363648 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.371571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.374541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.438055 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.826326 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.829484 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.831922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.832647 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.836173 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-55crm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.858865 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.860449 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.867234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.872110 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"]
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968355 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968431 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d45m\" (UniqueName: \"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d45m\" (UniqueName: \"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070440 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070563 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071046 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071252 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.075194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.078956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.079697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.087268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.088583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.091481 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d45m\" (UniqueName: \"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.151034 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j"
Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.174550 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5gcvm"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.319981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.321312 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324508 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324708 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324999 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j9zcb"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.325131 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.354554 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433244 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433362 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433454 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433486 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvf5\" (UniqueName: \"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433540 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535247 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvf5\" (UniqueName: \"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0"
Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536380 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536829 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.543202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.554892 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvf5\" (UniqueName: \"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.558548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.559837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.566370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.640057 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.814740 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.815473 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bqnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(bfe9133c-0d58-4877-97ee-5b0abeee1a95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.817363 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.982807 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.647538 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.647994 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nbhfh5ddh5cdhf8h687h67bh5b8h5cbh58dh4h78hd8hddh5c4h585h64bhcdhbh64dh559hb7h69h5ddh9h57ch5h75h5fch66fh8ch5ddq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bb8mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(cb8cda87-65c5-4be7-9891-b82bcfc8e0d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.649237 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="cb8cda87-65c5-4be7-9891-b82bcfc8e0d4" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.993291 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="cb8cda87-65c5-4be7-9891-b82bcfc8e0d4" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599012 4773 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599381 4773 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599527 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjqgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.600700 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.015461 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.426736 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.427123 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tpc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:fal
se,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2xgbd_openstack(e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.428635 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" podUID="e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.435466 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.435602 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5ldb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-tz8rp_openstack(a00a537a-172f-4ec7-9573-dd9ac2f347e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.437207 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.497798 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.498211 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5rt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zb8x4_openstack(2dfd387a-7b46-44b7-8aed-53e919c99903): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.500449 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" podUID="2dfd387a-7b46-44b7-8aed-53e919c99903" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.536099 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.536277 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qpxj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-75zzb_openstack(22a48624-e6b5-4225-baf3-c05ff3bed80d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.537457 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" Jan 20 18:47:46 crc kubenswrapper[4773]: I0120 18:47:46.921878 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 20 18:47:46 crc kubenswrapper[4773]: I0120 18:47:46.976107 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j"] Jan 20 18:47:46 crc kubenswrapper[4773]: W0120 18:47:46.981110 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fce4eb9_f614_4050_a099_0a743695dcd9.slice/crio-f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011 WatchSource:0}: Error finding container f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011: Status 404 returned error can't find the container with id f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011 Jan 20 18:47:47 crc kubenswrapper[4773]: W0120 18:47:47.020480 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c900f03_61d3_470c_9803_3f6b617ddf0a.slice/crio-ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6 WatchSource:0}: Error finding container ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6: Status 404 returned error can't find the container with id ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6 Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.020568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779"} Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 
18:47:47.020657 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.022654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j" event={"ID":"2fce4eb9-f614-4050-a099-0a743695dcd9","Type":"ContainerStarted","Data":"f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011"} Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.023874 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"397907a0dddb31a8de7ba8daa66a368708d05cd12bb1a2803808592054be6bda"} Jan 20 18:47:47 crc kubenswrapper[4773]: E0120 18:47:47.026522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" Jan 20 18:47:47 crc kubenswrapper[4773]: E0120 18:47:47.026523 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.388386 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.390028 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487814 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487871 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487972 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488142 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config" (OuterVolumeSpecName: "config") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488817 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.493445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config" (OuterVolumeSpecName: "config") pod "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" (UID: "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.493836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.495504 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2" (OuterVolumeSpecName: "kube-api-access-n5rt2") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "kube-api-access-n5rt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.513208 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9" (OuterVolumeSpecName: "kube-api-access-9tpc9") pod "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" (UID: "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4"). InnerVolumeSpecName "kube-api-access-9tpc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.590844 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.590880 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.591099 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.591108 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.595995 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.031960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" 
event={"ID":"2dfd387a-7b46-44b7-8aed-53e919c99903","Type":"ContainerDied","Data":"b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.032058 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.035987 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.037960 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.037959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" event={"ID":"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4","Type":"ContainerDied","Data":"6668d8f915cf882b03ecb7f9c9321d7df5e2547abe73d4876b6b1e6b3be6ef5b"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.039167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"ce5f6226d2493b30a9b67af06562b91019b4cb76229060cbb6768f07476f19ed"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.040476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.043013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.077695 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.088345 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.149860 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.152583 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:49 crc kubenswrapper[4773]: I0120 18:47:49.456292 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfd387a-7b46-44b7-8aed-53e919c99903" path="/var/lib/kubelet/pods/2dfd387a-7b46-44b7-8aed-53e919c99903/volumes" Jan 20 18:47:49 crc kubenswrapper[4773]: I0120 18:47:49.457080 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" path="/var/lib/kubelet/pods/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4/volumes" Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.062500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"7148ef81189f7da6562d890081d09f6e1a6d847f1f4ff4ecfef97131174ad0b7"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.064413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"91e2e3095c5bf9a2f616b1fb8993a836773ec0298326099365b899aaf3e3c453"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.065616 4773 generic.go:334] 
"Generic (PLEG): container finished" podID="11b243ca-6da3-4247-a1fe-2ea3e5be80cc" containerID="90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779" exitCode=0 Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.065692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerDied","Data":"90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.067209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.068832 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j" event={"ID":"2fce4eb9-f614-4050-a099-0a743695dcd9","Type":"ContainerStarted","Data":"b8bd9c6cf013143eb926554716c1d42906fabc0b03ee57b125a562e3cdbdaf9e"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.068985 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.105493 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-t5h8j" podStartSLOduration=27.629158474 podStartE2EDuration="31.105475441s" podCreationTimestamp="2026-01-20 18:47:20 +0000 UTC" firstStartedPulling="2026-01-20 18:47:46.983800149 +0000 UTC m=+1059.905613173" lastFinishedPulling="2026-01-20 18:47:50.460117116 +0000 UTC m=+1063.381930140" observedRunningTime="2026-01-20 18:47:51.101486395 +0000 UTC m=+1064.023299439" watchObservedRunningTime="2026-01-20 18:47:51.105475441 +0000 UTC m=+1064.027288465" Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.078186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"44afb65ac51ffbb0bfa32da6f84f67ac17b0880684349e0fd30efcf6c3037e58"} Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.081589 4773 generic.go:334] "Generic (PLEG): container finished" podID="bada64ed-c7da-4bd9-9195-75bbdcdd0406" containerID="140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194" exitCode=0 Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.082117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerDied","Data":"140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194"} Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.125525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.603393186 podStartE2EDuration="40.125503276s" podCreationTimestamp="2026-01-20 18:47:12 +0000 UTC" firstStartedPulling="2026-01-20 18:47:14.120923553 +0000 UTC m=+1027.042736577" lastFinishedPulling="2026-01-20 18:47:45.643033643 +0000 UTC m=+1058.564846667" observedRunningTime="2026-01-20 18:47:52.116498319 +0000 UTC m=+1065.038311343" watchObservedRunningTime="2026-01-20 18:47:52.125503276 +0000 UTC m=+1065.047316300" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.092107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"76dd5a962ff426a3eb55fbf08ddd3d739fe172ac05a461758797c6451644b570"} Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.092472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"4d959497ec0bb96d17059210dfd4779cc5b4ce6b9afea2b889b9466c137577b3"} Jan 20 
18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.093554 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.093582 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.377141 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.377197 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.103814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"6f73c3796bc90e5194cd970ada8705deecc5d419dd1c8d59789f7a38452be4a2"} Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.108655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"9e2d0dc0ea50be4df20c0569611b3a134aca609e02a64146777d39153fa055de"} Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.136264 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.212972242 podStartE2EDuration="35.13624095s" podCreationTimestamp="2026-01-20 18:47:19 +0000 UTC" firstStartedPulling="2026-01-20 18:47:46.940091105 +0000 UTC m=+1059.861904129" lastFinishedPulling="2026-01-20 18:47:53.863359813 +0000 UTC m=+1066.785172837" observedRunningTime="2026-01-20 18:47:54.122903428 +0000 UTC m=+1067.044716462" watchObservedRunningTime="2026-01-20 18:47:54.13624095 +0000 UTC m=+1067.058053984" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.136415 4773 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5gcvm" podStartSLOduration=31.278450309 podStartE2EDuration="34.136409784s" podCreationTimestamp="2026-01-20 18:47:20 +0000 UTC" firstStartedPulling="2026-01-20 18:47:47.602312796 +0000 UTC m=+1060.524125820" lastFinishedPulling="2026-01-20 18:47:50.460272261 +0000 UTC m=+1063.382085295" observedRunningTime="2026-01-20 18:47:53.116092642 +0000 UTC m=+1066.037905666" watchObservedRunningTime="2026-01-20 18:47:54.136409784 +0000 UTC m=+1067.058222818" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.153159 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.297612373 podStartE2EDuration="31.153142437s" podCreationTimestamp="2026-01-20 18:47:23 +0000 UTC" firstStartedPulling="2026-01-20 18:47:47.022855041 +0000 UTC m=+1059.944668065" lastFinishedPulling="2026-01-20 18:47:53.878385105 +0000 UTC m=+1066.800198129" observedRunningTime="2026-01-20 18:47:54.1462271 +0000 UTC m=+1067.068040124" watchObservedRunningTime="2026-01-20 18:47:54.153142437 +0000 UTC m=+1067.074955461" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.640092 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.640145 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.681616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.159152 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.419818 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.439158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.478767 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.480189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.482006 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.488908 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.489842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.494174 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.505341 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.516557 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.664624 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588tk\" (UniqueName: 
\"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.695552 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.716405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.716638 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.723714 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.782946 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783006 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod 
\"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783174 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: 
\"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783595 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784598 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 
18:47:55.785050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.785070 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.788981 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.789031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.802904 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.808480 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.812700 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.820973 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885448 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885480 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.922303 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.969157 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" 
Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.990612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.990978 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.991084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.991240 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.006146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.036910 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.087865 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088069 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088159 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088208 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: 
\"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config" (OuterVolumeSpecName: "config") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088664 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config" (OuterVolumeSpecName: "config") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088785 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.089143 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.091413 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5" (OuterVolumeSpecName: "kube-api-access-qpxj5") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "kube-api-access-qpxj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.096282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb" (OuterVolumeSpecName: "kube-api-access-d5ldb") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "kube-api-access-d5ldb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.126175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" event={"ID":"22a48624-e6b5-4225-baf3-c05ff3bed80d","Type":"ContainerDied","Data":"a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268"} Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.126271 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.131213 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.131363 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" event={"ID":"a00a537a-172f-4ec7-9573-dd9ac2f347e3","Type":"ContainerDied","Data":"c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf"} Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189669 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189702 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189714 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189724 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189736 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189747 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: 
I0120 18:47:56.206998 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.217526 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.241535 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.249861 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.279482 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:47:56 crc kubenswrapper[4773]: W0120 18:47:56.367376 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ceb1c5_1dbc_4810_95c9_c1ac0b915542.slice/crio-68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0 WatchSource:0}: Error finding container 68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0: Status 404 returned error can't find the container with id 68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0 Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.375092 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"] Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.439021 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.517640 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.543119 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 
18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.152498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.155546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4","Type":"ContainerStarted","Data":"fd5c88904eb01081abb48040c0407ee8e8b2891234254b33e7cfd8a35fd7f534"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.157098 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.165858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.166157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"4264d71b8e07e5eb9a8f6822d3dd22ead9577270d589def4baeb9a9c2e4760f2"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.172251 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.172308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" 
event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"0ac79f3c39c06ac9f58156f40c2aaee8557c1dbe4bc8966a0b68c584c793d8d2"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.176352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xs9zd" event={"ID":"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542","Type":"ContainerStarted","Data":"c5b27e44cc2587a9ad27aa31efc4210c595b01c54234dff6e697ee360829be1d"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.176386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xs9zd" event={"ID":"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542","Type":"ContainerStarted","Data":"68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0"} Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.207473 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.385477634 podStartE2EDuration="43.207456603s" podCreationTimestamp="2026-01-20 18:47:14 +0000 UTC" firstStartedPulling="2026-01-20 18:47:15.639781241 +0000 UTC m=+1028.561594275" lastFinishedPulling="2026-01-20 18:47:56.46176022 +0000 UTC m=+1069.383573244" observedRunningTime="2026-01-20 18:47:57.204387689 +0000 UTC m=+1070.126200733" watchObservedRunningTime="2026-01-20 18:47:57.207456603 +0000 UTC m=+1070.129269627" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.237558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.250162 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xs9zd" podStartSLOduration=2.250141862 podStartE2EDuration="2.250141862s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 
18:47:57.226507623 +0000 UTC m=+1070.148320647" watchObservedRunningTime="2026-01-20 18:47:57.250141862 +0000 UTC m=+1070.171954896" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.459143 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" path="/var/lib/kubelet/pods/22a48624-e6b5-4225-baf3-c05ff3bed80d/volumes" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.459713 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" path="/var/lib/kubelet/pods/a00a537a-172f-4ec7-9573-dd9ac2f347e3/volumes" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.522348 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.523577 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.525547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.525914 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7w57g" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.526097 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.527397 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.539262 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625724 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: 
\"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.626038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727656 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727724 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727842 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc 
kubenswrapper[4773]: I0120 18:47:57.733074 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.733182 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.738662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.749392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0" Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.869495 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.186798 4773 generic.go:334] "Generic (PLEG): container finished" podID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerID="ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f" exitCode=0 Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f"} Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207"} Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187282 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189603 4773 generic.go:334] "Generic (PLEG): container finished" podID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb" exitCode=0 Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"} Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"} 
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.190860 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.212119 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" podStartSLOduration=2.7823644610000002 podStartE2EDuration="3.212103228s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="2026-01-20 18:47:56.556553436 +0000 UTC m=+1069.478366470" lastFinishedPulling="2026-01-20 18:47:56.986292223 +0000 UTC m=+1069.908105237" observedRunningTime="2026-01-20 18:47:58.209187938 +0000 UTC m=+1071.131000962" watchObservedRunningTime="2026-01-20 18:47:58.212103228 +0000 UTC m=+1071.133916242" Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.234597 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" podStartSLOduration=2.605404284 podStartE2EDuration="3.234579149s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="2026-01-20 18:47:56.291322162 +0000 UTC m=+1069.213135186" lastFinishedPulling="2026-01-20 18:47:56.920497027 +0000 UTC m=+1069.842310051" observedRunningTime="2026-01-20 18:47:58.231437584 +0000 UTC m=+1071.153250608" watchObservedRunningTime="2026-01-20 18:47:58.234579149 +0000 UTC m=+1071.156392173" Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.396184 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.197766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"38ea9378adbfaecb468fa8a1b251573fd8651ef7cc4484900b8aebbd8dc7d654"} Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.476692 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.565558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.214691 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerStarted","Data":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"} Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.215252 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.216073 4773 generic.go:334] "Generic (PLEG): container finished" podID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" containerID="bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b" exitCode=0 Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.216144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerDied","Data":"bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b"} Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"e1fe5e0dd3e927154800bb7ab92dd110d34928df22cd240619a05e378f41b981"} Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220768 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220779 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"a9882d23a0b277ed94eaad4fc602a61a222cefd16c506c6f53e6a6e88ed8b7d3"} Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.236568 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.622350898 podStartE2EDuration="44.236553832s" podCreationTimestamp="2026-01-20 18:47:16 +0000 UTC" firstStartedPulling="2026-01-20 18:47:17.580255231 +0000 UTC m=+1030.502068255" lastFinishedPulling="2026-01-20 18:47:59.194458165 +0000 UTC m=+1072.116271189" observedRunningTime="2026-01-20 18:48:00.233827796 +0000 UTC m=+1073.155640820" watchObservedRunningTime="2026-01-20 18:48:00.236553832 +0000 UTC m=+1073.158366846" Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.301888 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.035947434 podStartE2EDuration="3.301862896s" podCreationTimestamp="2026-01-20 18:47:57 +0000 UTC" firstStartedPulling="2026-01-20 18:47:58.483152341 +0000 UTC m=+1071.404965365" lastFinishedPulling="2026-01-20 18:47:59.749067803 +0000 UTC m=+1072.670880827" observedRunningTime="2026-01-20 18:48:00.287451348 +0000 UTC m=+1073.209264382" watchObservedRunningTime="2026-01-20 18:48:00.301862896 +0000 UTC m=+1073.223675920" Jan 20 18:48:01 crc kubenswrapper[4773]: I0120 18:48:01.228852 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"5522656ebb75af39ada2f53370d115ae8f0a73bc77151e606425d9a05af78373"} Jan 20 18:48:01 crc kubenswrapper[4773]: I0120 18:48:01.255576 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371988.599232 podStartE2EDuration="48.255544742s" podCreationTimestamp="2026-01-20 18:47:13 +0000 UTC" 
firstStartedPulling="2026-01-20 18:47:15.588351512 +0000 UTC m=+1028.510164536" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:01.246786081 +0000 UTC m=+1074.168599115" watchObservedRunningTime="2026-01-20 18:48:01.255544742 +0000 UTC m=+1074.177357786" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.122878 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.124209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.129324 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.133778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.203347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.203452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.304816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.304916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.306099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.324649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.444919 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.878573 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:03 crc kubenswrapper[4773]: I0120 18:48:03.242515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerStarted","Data":"b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307"} Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.252469 4773 generic.go:334] "Generic (PLEG): container finished" podID="ce796025-4e2f-439c-9fab-20c8295a792c" containerID="fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd" exitCode=0 Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.252532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerDied","Data":"fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd"} Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.827027 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.828536 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.839510 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.868066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.868122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.926053 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.927196 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.933592 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.934444 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.949681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.949738 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod 
\"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.969858 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.051881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:05 crc 
kubenswrapper[4773]: I0120 18:48:05.053206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.070571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.143984 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.154269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.154358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.155063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.167302 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8pd22"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.168302 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.173868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.185845 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8pd22"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.243080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.287237 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.288241 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.292225 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.299321 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.303108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.357254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.357384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.390964 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22" Jan 20 
18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458526 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.463354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.477347 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " 
pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.527655 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8bf57"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.529882 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.535141 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8bf57"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.568110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.568171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.569997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.594810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9d9d\" (UniqueName: 
\"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.597170 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.598431 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.600890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.611723 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.643161 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.650860 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:48:05 crc kubenswrapper[4773]: W0120 18:48:05.661806 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7d65a8_afba_4e1a_a7e6_b9483c97fdcb.slice/crio-a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9 WatchSource:0}: Error finding container a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9: Status 404 returned error can't find the container with id a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9 Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.661885 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.672391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.672453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.694565 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774222 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.775001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod 
\"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.794828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.811174 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.854604 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.857993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: W0120 18:48:05.873283 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6e707f5_41a8_43c6_976a_7a9645c0b0ca.slice/crio-c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841 WatchSource:0}: Error finding container c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841: Status 404 returned error can't find the container with id c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841 Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"ce796025-4e2f-439c-9fab-20c8295a792c\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875583 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"ce796025-4e2f-439c-9fab-20c8295a792c\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") "
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.876032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.876366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce796025-4e2f-439c-9fab-20c8295a792c" (UID: "ce796025-4e2f-439c-9fab-20c8295a792c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.880692 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq" (OuterVolumeSpecName: "kube-api-access-5s2gq") pod "ce796025-4e2f-439c-9fab-20c8295a792c" (UID: "ce796025-4e2f-439c-9fab-20c8295a792c"). InnerVolumeSpecName "kube-api-access-5s2gq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.880901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.899506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.925197 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.977993 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.978020 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.039240 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.099827 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"]
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.157218 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8pd22"]
Jan 20 18:48:06 crc kubenswrapper[4773]: W0120 18:48:06.168819 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c64cf4d_562e_4a78_a22b_d682436d5db3.slice/crio-e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827 WatchSource:0}: Error finding container e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827: Status 404 returned error can't find the container with id e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.196048 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"]
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerDied","Data":"b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285514 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285183 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.290670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerStarted","Data":"bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.290719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerStarted","Data":"c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.306645 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fb33-account-create-update-2nkdm" podStartSLOduration=2.306625366 podStartE2EDuration="2.306625366s" podCreationTimestamp="2026-01-20 18:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:06.305913319 +0000 UTC m=+1079.227726343" watchObservedRunningTime="2026-01-20 18:48:06.306625366 +0000 UTC m=+1079.228438390"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309263 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerID="e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9" exitCode=0
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerDied","Data":"e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerStarted","Data":"a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.310667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerStarted","Data":"e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.311750 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" containerID="cri-o://d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" gracePeriod=10
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.311947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerStarted","Data":"6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5"}
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.389253 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8bf57"]
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.450834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"]
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.727871 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") "
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") "
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891719 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") "
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") "
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.897053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk" (OuterVolumeSpecName: "kube-api-access-588tk") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "kube-api-access-588tk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.930235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config" (OuterVolumeSpecName: "config") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.932501 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.938591 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.944211 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994037 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994080 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994089 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994100 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321031 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerID="7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321098 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerDied","Data":"7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerStarted","Data":"203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.323032 4773 generic.go:334] "Generic (PLEG): container finished" podID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerID="bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.323214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerDied","Data":"bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.324996 4773 generic.go:334] "Generic (PLEG): container finished" podID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325052 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"0ac79f3c39c06ac9f58156f40c2aaee8557c1dbe4bc8966a0b68c584c793d8d2"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325097 4773 scope.go:117] "RemoveContainer" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325235 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.328200 4773 generic.go:334] "Generic (PLEG): container finished" podID="484e46fc-ebda-496a-9884-295fcd065e9b" containerID="c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.328252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerDied","Data":"c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.330414 4773 generic.go:334] "Generic (PLEG): container finished" podID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerID="100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.330466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerDied","Data":"100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.333924 4773 generic.go:334] "Generic (PLEG): container finished" podID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerID="2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9" exitCode=0
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.333965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerDied","Data":"2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.334002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerStarted","Data":"cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b"}
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.352259 4773 scope.go:117] "RemoveContainer" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.385246 4773 scope.go:117] "RemoveContainer" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"
Jan 20 18:48:07 crc kubenswrapper[4773]: E0120 18:48:07.385894 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": container with ID starting with d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea not found: ID does not exist" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386005 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"} err="failed to get container status \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": rpc error: code = NotFound desc = could not find container \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": container with ID starting with d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea not found: ID does not exist"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386072 4773 scope.go:117] "RemoveContainer" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"
Jan 20 18:48:07 crc kubenswrapper[4773]: E0120 18:48:07.386620 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": container with ID starting with 3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb not found: ID does not exist" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386647 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"} err="failed to get container status \"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": rpc error: code = NotFound desc = could not find container \"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": container with ID starting with 3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb not found: ID does not exist"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.422813 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"]
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.429517 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"]
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.461496 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" path="/var/lib/kubelet/pods/f7c93b98-cee9-4ca4-af53-0a939fece59b/volumes"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.636215 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.808370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") "
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.808720 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") "
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.809324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" (UID: "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.812187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw" (OuterVolumeSpecName: "kube-api-access-qnkqw") pod "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" (UID: "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb"). InnerVolumeSpecName "kube-api-access-qnkqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.910170 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.910206 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346833 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerDied","Data":"a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9"}
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346892 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.698954 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.826732 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"2ce8f955-26cb-4860-afc1-effceac1d7a4\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") "
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.826791 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"2ce8f955-26cb-4860-afc1-effceac1d7a4\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") "
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.828088 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ce8f955-26cb-4860-afc1-effceac1d7a4" (UID: "2ce8f955-26cb-4860-afc1-effceac1d7a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.832190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g" (OuterVolumeSpecName: "kube-api-access-9cm6g") pod "2ce8f955-26cb-4860-afc1-effceac1d7a4" (UID: "2ce8f955-26cb-4860-afc1-effceac1d7a4"). InnerVolumeSpecName "kube-api-access-9cm6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.885952 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.891235 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.907877 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.915614 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.929127 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.929158 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.029858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.029922 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.029983 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"0c64cf4d-562e-4a78-a22b-d682436d5db3\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030092 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"484e46fc-ebda-496a-9884-295fcd065e9b\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"0c64cf4d-562e-4a78-a22b-d682436d5db3\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030144 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"484e46fc-ebda-496a-9884-295fcd065e9b\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") "
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "484e46fc-ebda-496a-9884-295fcd065e9b" (UID: "484e46fc-ebda-496a-9884-295fcd065e9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" (UID: "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.031081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c64cf4d-562e-4a78-a22b-d682436d5db3" (UID: "0c64cf4d-562e-4a78-a22b-d682436d5db3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.031217 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6e707f5-41a8-43c6-976a-7a9645c0b0ca" (UID: "c6e707f5-41a8-43c6-976a-7a9645c0b0ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.033593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds" (OuterVolumeSpecName: "kube-api-access-nqzds") pod "0c64cf4d-562e-4a78-a22b-d682436d5db3" (UID: "0c64cf4d-562e-4a78-a22b-d682436d5db3"). InnerVolumeSpecName "kube-api-access-nqzds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.033816 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85" (OuterVolumeSpecName: "kube-api-access-7wl85") pod "c6e707f5-41a8-43c6-976a-7a9645c0b0ca" (UID: "c6e707f5-41a8-43c6-976a-7a9645c0b0ca"). InnerVolumeSpecName "kube-api-access-7wl85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.034056 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7" (OuterVolumeSpecName: "kube-api-access-fdgx7") pod "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" (UID: "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea"). InnerVolumeSpecName "kube-api-access-fdgx7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.035714 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d" (OuterVolumeSpecName: "kube-api-access-p9d9d") pod "484e46fc-ebda-496a-9884-295fcd065e9b" (UID: "484e46fc-ebda-496a-9884-295fcd065e9b"). InnerVolumeSpecName "kube-api-access-p9d9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132722 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132800 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132822 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132843 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132907 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132952 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132970 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132986 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359096 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerDied","Data":"c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841"}
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359379 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerDied","Data":"6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5"}
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361170 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361327 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerDied","Data":"e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827"}
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365718 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365772 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375001 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerDied","Data":"cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b"}
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375041 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b"
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375047 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerDied","Data":"203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376296 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376357 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.860473 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861114 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="init" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861128 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="init" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861137 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861144 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861168 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc 
kubenswrapper[4773]: I0120 18:48:10.861175 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861199 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861205 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861215 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861221 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861228 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861234 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861249 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861255 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861270 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861275 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861285 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861290 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861433 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861448 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861458 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861466 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861474 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861482 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc 
kubenswrapper[4773]: I0120 18:48:10.861492 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.862034 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.864313 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vtkh" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.866049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.869272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959551 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.960010 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.062194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.065881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.068498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.078782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.083573 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.180295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.658940 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:11 crc kubenswrapper[4773]: W0120 18:48:11.672233 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa7530e2_53e5_4891_9a0e_ff23ee1c61bc.slice/crio-dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5 WatchSource:0}: Error finding container dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5: Status 404 returned error can't find the container with id dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5 Jan 20 18:48:12 crc kubenswrapper[4773]: I0120 18:48:12.396989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerStarted","Data":"dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5"} Jan 20 18:48:12 crc kubenswrapper[4773]: I0120 18:48:12.927997 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.534373 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.540746 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.613041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.614886 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.618175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.622048 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.712146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.712221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: 
\"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.832743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.933598 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:14 crc kubenswrapper[4773]: I0120 18:48:14.342006 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:14 crc kubenswrapper[4773]: I0120 18:48:14.414531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerStarted","Data":"428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe"} Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.421801 4773 generic.go:334] "Generic (PLEG): container finished" podID="7f5455e9-7072-4154-b881-75a1da2c0466" containerID="7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de" exitCode=0 Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.421969 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerDied","Data":"7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de"} Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.455738 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" path="/var/lib/kubelet/pods/ce796025-4e2f-439c-9fab-20c8295a792c/volumes" Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.187100 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t5h8j" podUID="2fce4eb9-f614-4050-a099-0a743695dcd9" containerName="ovn-controller" probeResult="failure" output=< Jan 20 18:48:21 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 20 18:48:21 crc kubenswrapper[4773]: > Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.487464 4773 generic.go:334] "Generic (PLEG): container finished" podID="d4dfff97-df7d-498f-9203-9c2cb0d84667" 
containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f" exitCode=0 Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.487509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.492134 4773 generic.go:334] "Generic (PLEG): container finished" podID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerID="582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb" exitCode=0 Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.492179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb"} Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.360710 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.458773 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"7f5455e9-7072-4154-b881-75a1da2c0466\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.458840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"7f5455e9-7072-4154-b881-75a1da2c0466\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.459568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f5455e9-7072-4154-b881-75a1da2c0466" (UID: "7f5455e9-7072-4154-b881-75a1da2c0466"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.462474 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x" (OuterVolumeSpecName: "kube-api-access-6fq7x") pod "7f5455e9-7072-4154-b881-75a1da2c0466" (UID: "7f5455e9-7072-4154-b881-75a1da2c0466"). InnerVolumeSpecName "kube-api-access-6fq7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499799 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerDied","Data":"428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe"} Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499836 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499890 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.560609 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.560644 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.508493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerStarted","Data":"80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.512341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 
18:48:23.512613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.514902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.515196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.527758 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-29z4h" podStartSLOduration=2.776939087 podStartE2EDuration="13.527738697s" podCreationTimestamp="2026-01-20 18:48:10 +0000 UTC" firstStartedPulling="2026-01-20 18:48:11.673777718 +0000 UTC m=+1084.595590752" lastFinishedPulling="2026-01-20 18:48:22.424577338 +0000 UTC m=+1095.346390362" observedRunningTime="2026-01-20 18:48:23.524633813 +0000 UTC m=+1096.446446847" watchObservedRunningTime="2026-01-20 18:48:23.527738697 +0000 UTC m=+1096.449551721" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.562725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.483973342 podStartE2EDuration="1m13.5627014s" podCreationTimestamp="2026-01-20 18:47:10 +0000 UTC" firstStartedPulling="2026-01-20 18:47:13.34618256 +0000 UTC m=+1026.267995584" lastFinishedPulling="2026-01-20 18:47:46.424910618 +0000 UTC m=+1059.346723642" observedRunningTime="2026-01-20 18:48:23.549705186 +0000 UTC m=+1096.471518211" watchObservedRunningTime="2026-01-20 18:48:23.5627014 +0000 UTC m=+1096.484514434" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.591209 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.986218316 podStartE2EDuration="1m13.591193206s" podCreationTimestamp="2026-01-20 18:47:10 +0000 UTC" firstStartedPulling="2026-01-20 18:47:13.001888801 +0000 UTC m=+1025.923701825" lastFinishedPulling="2026-01-20 18:47:45.606863691 +0000 UTC m=+1058.528676715" observedRunningTime="2026-01-20 18:48:23.574225537 +0000 UTC m=+1096.496038581" watchObservedRunningTime="2026-01-20 18:48:23.591193206 +0000 UTC m=+1096.513006230" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.188611 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t5h8j" podUID="2fce4eb9-f614-4050-a099-0a743695dcd9" containerName="ovn-controller" probeResult="failure" output=< Jan 20 18:48:26 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 20 18:48:26 crc kubenswrapper[4773]: > Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.209151 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.210792 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.453441 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:26 crc kubenswrapper[4773]: E0120 18:48:26.453847 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.453873 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.454082 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.454663 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.456612 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.469393 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536300 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536616 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638591 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638635 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.639043 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.639077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.640026 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.641029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.659811 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.774828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:27 crc kubenswrapper[4773]: W0120 18:48:27.020225 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53283907_9a5d_4568_9db6_bce4357ad6a4.slice/crio-bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379 WatchSource:0}: Error finding container bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379: Status 404 returned error can't find the container with id bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379 Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.030538 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.545888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerStarted","Data":"ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826"} Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.547070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerStarted","Data":"bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379"} Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.563856 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-t5h8j-config-47pnf" podStartSLOduration=1.563834946 podStartE2EDuration="1.563834946s" podCreationTimestamp="2026-01-20 18:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:27.558429697 +0000 UTC m=+1100.480242721" watchObservedRunningTime="2026-01-20 18:48:27.563834946 +0000 UTC m=+1100.485647970" 
Jan 20 18:48:28 crc kubenswrapper[4773]: I0120 18:48:28.554217 4773 generic.go:334] "Generic (PLEG): container finished" podID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerID="ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826" exitCode=0 Jan 20 18:48:28 crc kubenswrapper[4773]: I0120 18:48:28.554312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerDied","Data":"ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826"} Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.864165 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990132 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990239 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990267 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run" (OuterVolumeSpecName: "var-run") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990562 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990592 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991266 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991361 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991383 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991398 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991408 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991478 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts" (OuterVolumeSpecName: "scripts") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.998299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l" (OuterVolumeSpecName: "kube-api-access-5bs7l") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "kube-api-access-5bs7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.093030 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.093063 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569711 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerDied","Data":"bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379"} Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569759 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569788 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.594724 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.602059 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.693574 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:30 crc kubenswrapper[4773]: E0120 18:48:30.693867 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.693884 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.694073 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.695083 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.697963 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.709323 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801400 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801607 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: 
\"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" 
(UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903195 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903529 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: 
\"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.904372 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.904442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.905650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.921186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.010293 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.205619 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-t5h8j" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.465134 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" path="/var/lib/kubelet/pods/53283907-9a5d-4568-9db6-bce4357ad6a4/volumes" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.483216 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.577195 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerStarted","Data":"abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2"} Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.584588 4773 generic.go:334] "Generic (PLEG): container finished" podID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerID="fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4" exitCode=0 Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.584633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerDied","Data":"fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4"} Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.619361 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.629095 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.592663 4773 generic.go:334] 
"Generic (PLEG): container finished" podID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerID="80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616" exitCode=0 Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.592760 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerDied","Data":"80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616"} Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.901968 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072055 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072176 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run" (OuterVolumeSpecName: "var-run") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072282 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072364 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") "
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072403 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072760 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072783 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072794 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.073250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.073811 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts" (OuterVolumeSpecName: "scripts") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.080059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs" (OuterVolumeSpecName: "kube-api-access-vwfbs") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "kube-api-access-vwfbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174063 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174375 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174385 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381179 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jqhz4"]
Jan 20 18:48:34 crc kubenswrapper[4773]: E0120 18:48:34.381586 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381609 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381813 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.382470 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.391814 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jqhz4"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.478865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.479184 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.494864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fldlp"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.495850 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.518960 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.520246 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.525615 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.529792 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fldlp"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.555663 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580805 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.582029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.616022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.649569 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.650641 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.656224 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658046 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerDied","Data":"abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2"}
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658134 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.665592 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682758 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.683669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.698906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.723744 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784340 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784673 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784777 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.785440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.800908 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kmlg7"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.801900 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.805818 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806442 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806553 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806749 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kmlg7"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.818219 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fldlp"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.824391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.858326 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888672 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888726 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888746 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.889622 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.912576 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sg7w8"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.916337 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.934506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.942066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sg7w8"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.957223 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"]
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.958249 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.962349 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.020856 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.021569 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.029388 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.031244 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.055295 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.075543 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.088341 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096850 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.097754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.115751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.121281 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kmlg7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.198924 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.199370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.200247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.227313 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-29z4h"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.230510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300169 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") "
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") "
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") "
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") "
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.303079 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.304326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.305659 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd" (OuterVolumeSpecName: "kube-api-access-nxckd") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "kube-api-access-nxckd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.326143 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.346207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data" (OuterVolumeSpecName: "config-data") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.352604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.368407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fldlp"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402107 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402134 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402145 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402154 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.424545 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jqhz4"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.474187 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" path="/var/lib/kubelet/pods/5bb055c2-7ce4-425c-8e65-df8438bde346/volumes"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.552358 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"]
Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.564208 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb313ef44_3ec0_4e2e_bc88_0187cce26783.slice/crio-b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c WatchSource:0}: Error finding container b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c: Status 404 returned error can't find the container with id b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.632130 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689003 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-29z4h"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerDied","Data":"dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689524 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5"
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.695468 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerStarted","Data":"a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.695533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerStarted","Data":"a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.698995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerStarted","Data":"c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.699037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerStarted","Data":"d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.701035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerStarted","Data":"b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c"}
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.731852 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"]
Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.741252 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-fldlp" podStartSLOduration=1.7412296710000001 podStartE2EDuration="1.741229671s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:35.718143878 +0000 UTC m=+1108.639956902" watchObservedRunningTime="2026-01-20 18:48:35.741229671 +0000 UTC m=+1108.663042695"
Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.750579 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2544d2a_4467_4356_9aee_21a75f6efedc.slice/crio-d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7 WatchSource:0}: Error finding container d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7: Status 404 returned error can't find the container with id d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7
Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.751237 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d41a48_da79_4b93_bf84_ab8b94fed1c1.slice/crio-25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994 WatchSource:0}: Error finding container 25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994: Status 404 returned error can't find the container with id 25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994
Jan 20 18:48:35 
crc kubenswrapper[4773]: I0120 18:48:35.752953 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-jqhz4" podStartSLOduration=1.75291328 podStartE2EDuration="1.75291328s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:35.732695267 +0000 UTC m=+1108.654508291" watchObservedRunningTime="2026-01-20 18:48:35.75291328 +0000 UTC m=+1108.674726304" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.766181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.816374 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.015839 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: E0120 18:48:36.016433 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.016449 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.016670 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.017437 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.076817 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118916 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221126 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gts\" (UniqueName: 
\"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.222535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.222672 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.223046 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.223101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.239173 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod 
\"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.346756 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.599223 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: W0120 18:48:36.607012 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd326b299_f619_4c76_9a10_045d77fa9bae.slice/crio-4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee WatchSource:0}: Error finding container 4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee: Status 404 returned error can't find the container with id 4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717363 4773 generic.go:334] "Generic (PLEG): container finished" podID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerID="59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerDied","Data":"59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerStarted","Data":"4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.719228 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerStarted","Data":"25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.720838 4773 generic.go:334] "Generic (PLEG): container finished" podID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerID="c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.720898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerDied","Data":"c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723251 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerID="17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723294 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerDied","Data":"17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerStarted","Data":"d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.725089 4773 generic.go:334] "Generic (PLEG): container finished" podID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerID="3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6" exitCode=0 Jan 20 18:48:36 crc 
kubenswrapper[4773]: I0120 18:48:36.725736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerDied","Data":"3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731134 4773 generic.go:334] "Generic (PLEG): container finished" podID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerID="19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerDied","Data":"19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerStarted","Data":"2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.735544 4773 generic.go:334] "Generic (PLEG): container finished" podID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerID="a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.735609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerDied","Data":"a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.737414 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" 
event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerStarted","Data":"4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee"} Jan 20 18:48:37 crc kubenswrapper[4773]: I0120 18:48:37.750255 4773 generic.go:334] "Generic (PLEG): container finished" podID="d326b299-f619-4c76-9a10-045d77fa9bae" containerID="526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118" exitCode=0 Jan 20 18:48:37 crc kubenswrapper[4773]: I0120 18:48:37.750355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.617146 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.688330 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.695637 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"b2544d2a-4467-4356-9aee-21a75f6efedc\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.695741 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"b2544d2a-4467-4356-9aee-21a75f6efedc\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.699331 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz" (OuterVolumeSpecName: "kube-api-access-lmthz") pod "b2544d2a-4467-4356-9aee-21a75f6efedc" (UID: "b2544d2a-4467-4356-9aee-21a75f6efedc"). InnerVolumeSpecName "kube-api-access-lmthz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.701328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2544d2a-4467-4356-9aee-21a75f6efedc" (UID: "b2544d2a-4467-4356-9aee-21a75f6efedc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.712055 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.760688 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.770254 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.777872 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782012 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerDied","Data":"4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782200 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerDied","Data":"d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783518 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783492 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784452 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerDied","Data":"d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784484 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784531 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786195 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerDied","Data":"b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786257 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787348 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerDied","Data":"2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787384 4773 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787392 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerDied","Data":"a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788388 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788390 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801412 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"be215ecb-8014-4db1-8eac-59f0d3dee870\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: 
\"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"be215ecb-8014-4db1-8eac-59f0d3dee870\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.802761 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.802803 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.803184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be215ecb-8014-4db1-8eac-59f0d3dee870" (UID: "be215ecb-8014-4db1-8eac-59f0d3dee870"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b742ea09-e1ce-4311-a9bf-7736d3ab235c" (UID: "b742ea09-e1ce-4311-a9bf-7736d3ab235c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d813dade-efd1-404d-ae3f-ecea71ffb5ee" (UID: "d813dade-efd1-404d-ae3f-ecea71ffb5ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804584 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph" (OuterVolumeSpecName: "kube-api-access-2lwph") pod "d813dade-efd1-404d-ae3f-ecea71ffb5ee" (UID: "d813dade-efd1-404d-ae3f-ecea71ffb5ee"). InnerVolumeSpecName "kube-api-access-2lwph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.808944 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r" (OuterVolumeSpecName: "kube-api-access-c9p9r") pod "b742ea09-e1ce-4311-a9bf-7736d3ab235c" (UID: "b742ea09-e1ce-4311-a9bf-7736d3ab235c"). InnerVolumeSpecName "kube-api-access-c9p9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.813281 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq" (OuterVolumeSpecName: "kube-api-access-hcrjq") pod "be215ecb-8014-4db1-8eac-59f0d3dee870" (UID: "be215ecb-8014-4db1-8eac-59f0d3dee870"). InnerVolumeSpecName "kube-api-access-hcrjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"181581ac-d6d3-4700-bfb7-7179a262a27c\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903761 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"181581ac-d6d3-4700-bfb7-7179a262a27c\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903846 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod 
\"b313ef44-3ec0-4e2e-bc88-0187cce26783\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904000 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"b313ef44-3ec0-4e2e-bc88-0187cce26783\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "181581ac-d6d3-4700-bfb7-7179a262a27c" (UID: "181581ac-d6d3-4700-bfb7-7179a262a27c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904524 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904534 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904542 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") on node \"crc\" 
DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904551 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904559 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904567 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b313ef44-3ec0-4e2e-bc88-0187cce26783" (UID: "b313ef44-3ec0-4e2e-bc88-0187cce26783"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.906995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d" (OuterVolumeSpecName: "kube-api-access-tdx2d") pod "b313ef44-3ec0-4e2e-bc88-0187cce26783" (UID: "b313ef44-3ec0-4e2e-bc88-0187cce26783"). InnerVolumeSpecName "kube-api-access-tdx2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.907510 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv" (OuterVolumeSpecName: "kube-api-access-jqfjv") pod "181581ac-d6d3-4700-bfb7-7179a262a27c" (UID: "181581ac-d6d3-4700-bfb7-7179a262a27c"). InnerVolumeSpecName "kube-api-access-jqfjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006090 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006130 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006142 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.796655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerStarted","Data":"1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e"} Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.796996 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.799062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerStarted","Data":"a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649"} Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.827344 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podStartSLOduration=6.827321286 podStartE2EDuration="6.827321286s" podCreationTimestamp="2026-01-20 18:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:41.819972249 +0000 UTC m=+1114.741785273" watchObservedRunningTime="2026-01-20 18:48:41.827321286 +0000 UTC m=+1114.749134310" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.843088 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kmlg7" podStartSLOduration=3.15695669 podStartE2EDuration="7.843066392s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="2026-01-20 18:48:35.754079358 +0000 UTC m=+1108.675892382" lastFinishedPulling="2026-01-20 18:48:40.44018906 +0000 UTC m=+1113.362002084" observedRunningTime="2026-01-20 18:48:41.837748665 +0000 UTC m=+1114.759561689" watchObservedRunningTime="2026-01-20 18:48:41.843066392 +0000 UTC m=+1114.764879416" Jan 20 18:48:44 crc kubenswrapper[4773]: I0120 18:48:44.828799 4773 generic.go:334] "Generic (PLEG): container finished" podID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerID="a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649" exitCode=0 Jan 20 18:48:44 crc kubenswrapper[4773]: I0120 18:48:44.828844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerDied","Data":"a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649"} Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 
18:48:46.163282 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.291572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.292343 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.292418 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.296836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk" (OuterVolumeSpecName: "kube-api-access-4p5kk") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "kube-api-access-4p5kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.318689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.335531 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data" (OuterVolumeSpecName: "config-data") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.349072 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.395434 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.396152 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.396169 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.411717 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.412026 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" containerID="cri-o://ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" gracePeriod=10 Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857763 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerDied","Data":"25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994"} Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857802 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857826 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.037945 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038319 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038339 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038350 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038357 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038368 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038374 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038387 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038393 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038404 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038410 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038426 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038436 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038442 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038579 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038599 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038608 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038617 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 
18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038624 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038633 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038641 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.039134 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041110 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041610 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041701 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041996 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.052961 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.066577 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.068186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.088116 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.115075 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115450 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217330 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217499 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.218663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.218669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.221643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: 
\"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.222549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.223440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.226558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.227459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.227471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.237755 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.268005 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.268988 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.275386 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.275596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fjv2n" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.282996 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.284775 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.297394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.297837 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.298048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300354 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300669 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6p7p9" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300853 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.309374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.312828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.316416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 
18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324813 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324875 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.324918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.364830 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.407988 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.409757 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.416548 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.424860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvh6j" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.425435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426541 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426966 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: 
\"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " 
pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.433329 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.439066 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.442724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.442874 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.444356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.484324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " 
pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.496308 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.506368 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.507430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.507985 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.514385 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.514992 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515252 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515441 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515639 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-r9k85" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.527170 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.528689 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" 
(UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.528842 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529311 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " 
pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529490 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530525 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530699 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.531554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.533277 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538426 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7hwmj" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538587 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538727 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.573905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: 
\"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.574002 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.609140 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.611131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.625882 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632227 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632393 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632527 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632605 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632714 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633046 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633102 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.638701 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: 
\"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.647114 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.661049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.673722 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.683090 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.715094 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.716437 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.729343 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737074 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737104 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737170 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: 
\"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737264 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737349 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737452 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737472 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.738860 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: 
\"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.741283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.742286 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.742415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.747516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.757310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.763999 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.764745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.764824 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765191 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765720 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.766097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.766597 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.770408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.811392 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838626 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838671 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838717 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838739 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838825 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.839309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.841538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.841760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.850554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: 
\"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.865100 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.865402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.868001 4773 generic.go:334] "Generic (PLEG): container finished" podID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerID="ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" exitCode=0 Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.868037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207"} Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.878822 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.899583 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.940096 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.941044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.941954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.940157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.944201 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.944256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.944286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.945636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.946282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.964303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.982397 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.050864 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.111476 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.116924 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:48 crc kubenswrapper[4773]: W0120 18:48:48.195905 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34ae367_2e63_4e91_8c3f_ed0a2a827607.slice/crio-3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49 WatchSource:0}: Error finding container 3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49: Status 404 returned error can't find the container with id 3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49 Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.463774 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562734 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562848 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562870 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.578544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l" (OuterVolumeSpecName: "kube-api-access-wbd7l") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "kube-api-access-wbd7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.606494 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.607709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.609150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config" (OuterVolumeSpecName: "config") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.616032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665239 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665590 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665603 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665614 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665625 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.689150 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.697310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.786873 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.798662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.807364 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:48 crc kubenswrapper[4773]: W0120 18:48:48.814076 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec857182_f4b2_46cd_8b7f_fdbc443d8a1a.slice/crio-f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c WatchSource:0}: Error finding container f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c: Status 404 returned error can't find the container with id f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.881538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerStarted","Data":"a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.881584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerStarted","Data":"a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.882431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerStarted","Data":"b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.885406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbff4dff5-7w2k2" event={"ID":"2c3949f0-4faa-4935-8d0f-7ce69d8de08d","Type":"ContainerStarted","Data":"3096768a065586cfd8bc32ac4cab442fcebf027a669b7a4ad518113982f3c163"} Jan 20 18:48:48 
crc kubenswrapper[4773]: I0120 18:48:48.889126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"4264d71b8e07e5eb9a8f6822d3dd22ead9577270d589def4baeb9a9c2e4760f2"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.889165 4773 scope.go:117] "RemoveContainer" containerID="ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.889303 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.904135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerStarted","Data":"f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.906219 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kkb9f" podStartSLOduration=1.9061948370000001 podStartE2EDuration="1.906194837s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:48.904464866 +0000 UTC m=+1121.826277900" watchObservedRunningTime="2026-01-20 18:48:48.906194837 +0000 UTC m=+1121.828007861" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909389 4773 generic.go:334] "Generic (PLEG): container finished" podID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerID="621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9" exitCode=0 Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" 
event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerDied","Data":"621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909537 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerStarted","Data":"3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.914387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerStarted","Data":"aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.917432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerStarted","Data":"2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.928613 4773 scope.go:117] "RemoveContainer" containerID="ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.964064 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.984997 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.004971 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:49 crc kubenswrapper[4773]: W0120 18:48:49.009283 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf667f5ef_cefc_40c0_a282_5d502cd45cd2.slice/crio-454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd WatchSource:0}: Error finding container 454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd: Status 404 returned error can't find the container with id 454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.027712 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.057481 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.133007 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.136763 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191228 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:48:49 crc kubenswrapper[4773]: E0120 18:48:49.191637 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="init" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191655 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="init" Jan 20 18:48:49 crc kubenswrapper[4773]: E0120 18:48:49.191670 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191677 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 
18:48:49.191828 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.210892 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"]
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.211169 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.280785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281056 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281462 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.382901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.382968 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.384631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.385666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.388164 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.421493 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.504039 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" path="/var/lib/kubelet/pods/aba9326a-e499-43a8-9f50-4dc29d62c960/volumes"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.544221 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.587958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") "
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588061 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") "
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") "
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") "
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") "
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.597716 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c" (OuterVolumeSpecName: "kube-api-access-4kv7c") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "kube-api-access-4kv7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.611330 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.615486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.618225 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config" (OuterVolumeSpecName: "config") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.625537 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.640991 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696597 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696633 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696648 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696661 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696673 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.927912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerDied","Data":"3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.928242 4773 scope.go:117] "RemoveContainer" containerID="621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.928374 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs"
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.937151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"0b60f30c0a7ab6c9d2a1ba89c628d4ad9f1438793f4c8b9c55bf5b64d977ae2b"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.939157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"d0e8ddb6dbdcbfbf1e26c6a891d80cf5f965501af473ae07fda9dcc295cac646"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.942200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerStarted","Data":"bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.944477 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.944511 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd"}
Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.978683 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rz89h" podStartSLOduration=2.9786630499999998 podStartE2EDuration="2.97866305s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:49.960827744 +0000 UTC m=+1122.882640768" watchObservedRunningTime="2026-01-20 18:48:49.97866305 +0000 UTC m=+1122.900476074"
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.056876 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"]
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.063141 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"]
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.079304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"]
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.957877 4773 generic.go:334] "Generic (PLEG): container finished" podID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerID="cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7" exitCode=0
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.957955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7"}
Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.960342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864c6579d5-v5vdm" event={"ID":"609b419f-cc52-4fef-aa49-f64cdbba6755","Type":"ContainerStarted","Data":"6a01b8d18b1bc750f52b0b014094ed52c18500c651ec5c625261c7be00925ccf"}
Jan 20 18:48:51 crc kubenswrapper[4773]: I0120 18:48:51.458099 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" path="/var/lib/kubelet/pods/b34ae367-2e63-4e91-8c3f-ed0a2a827607/volumes"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.139876 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"]
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166040 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"]
Jan 20 18:48:56 crc kubenswrapper[4773]: E0120 18:48:56.166452 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166478 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.167734 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.174034 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.195603 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"]
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233569 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233695 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.234052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.278565 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"]
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.312317 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68fb89f56b-287lx"]
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.313564 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.327087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fb89f56b-287lx"]
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335555 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.336754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.337149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.337163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.344126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.344407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.356448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.358171 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437543 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437897 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.539909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540170 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540470 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.541699 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.542467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.547822 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.548642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.554015 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.557282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.565084 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd"
Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.630993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fb89f56b-287lx"
Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.007991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d"}
Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.009008 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh"
Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.029761 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" podStartSLOduration=10.029745607 podStartE2EDuration="10.029745607s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:57.029152863 +0000 UTC m=+1129.950965907" watchObservedRunningTime="2026-01-20 18:48:57.029745607 +0000 UTC m=+1129.951558631"
Jan 20 18:48:58 crc kubenswrapper[4773]: I0120 18:48:58.170167 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:48:58 crc kubenswrapper[4773]: I0120 18:48:58.170441 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:49:00 crc kubenswrapper[4773]: I0120 18:49:00.038023 4773 generic.go:334] "Generic (PLEG): container finished" podID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerID="a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9" exitCode=0
Jan 20 18:49:00 crc kubenswrapper[4773]: I0120 18:49:00.038104 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerDied","Data":"a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9"}
Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.053143 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh"
Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.100841 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"]
Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.101079 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" containerID="cri-o://1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" gracePeriod=10
Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.814472 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified"
Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.815077 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch65bh5f8h586h9h597h57h88h586hdfh5d7h645h688h89h85h5ffh685h675h5ffh554h5c8hd4h585hbh578h5b5h54ch679h5b5h64ch5dh5c7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzp99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7bbff4dff5-7w2k2_openstack(2c3949f0-4faa-4935-8d0f-7ce69d8de08d): ErrImagePull:
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.817520 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7bbff4dff5-7w2k2" podUID="2c3949f0-4faa-4935-8d0f-7ce69d8de08d" Jan 20 18:49:04 crc kubenswrapper[4773]: I0120 18:49:04.068755 4773 generic.go:334] "Generic (PLEG): container finished" podID="d326b299-f619-4c76-9a10-045d77fa9bae" containerID="1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" exitCode=0 Jan 20 18:49:04 crc kubenswrapper[4773]: I0120 18:49:04.068806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e"} Jan 20 18:49:06 crc kubenswrapper[4773]: I0120 18:49:06.347649 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Jan 20 18:49:11 crc kubenswrapper[4773]: I0120 18:49:11.347670 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.865191 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.865373 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbh657h594h659h578h95h9bh69hcfh589h689h545h69hdfhc9h65ch56h569hbdh5b8h5c6h694h559hf6h76h577h5d7h77h694h98h8fh566q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpvqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminatio
nMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-864c6579d5-v5vdm_openstack(609b419f-cc52-4fef-aa49-f64cdbba6755): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.868025 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-864c6579d5-v5vdm" podUID="609b419f-cc52-4fef-aa49-f64cdbba6755" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.718791 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.723833 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.872910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873015 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873085 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873245 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 
18:49:14.883606 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts" (OuterVolumeSpecName: "scripts") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888117 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888365 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs" (OuterVolumeSpecName: "logs") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data" (OuterVolumeSpecName: "config-data") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.890697 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts" (OuterVolumeSpecName: "scripts") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.890730 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr" (OuterVolumeSpecName: "kube-api-access-rhtrr") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "kube-api-access-rhtrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.891410 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.891762 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99" (OuterVolumeSpecName: "kube-api-access-xzp99") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "kube-api-access-xzp99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.910252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.929706 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data" (OuterVolumeSpecName: "config-data") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975188 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975233 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975246 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975259 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 
crc kubenswrapper[4773]: I0120 18:49:14.975274 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975284 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975294 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975302 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975309 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975317 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975324 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.077170 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.077320 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfh67h666h5c6hch579hcdhch84h65ch5d9h77h668h67fh66h7fh55hc9h56fh546h78h595h59fh569hfbh5ch65fh5cdh5d5h649h66bh555q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mrxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(077faa57-a75d-4f1a-b01e-3fc69ddb5761): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerDied","Data":"a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b"} Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152502 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152288 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.153590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbff4dff5-7w2k2" event={"ID":"2c3949f0-4faa-4935-8d0f-7ce69d8de08d","Type":"ContainerDied","Data":"3096768a065586cfd8bc32ac4cab442fcebf027a669b7a4ad518113982f3c163"} Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.153682 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.232790 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.241448 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.458033 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3949f0-4faa-4935-8d0f-7ce69d8de08d" path="/var/lib/kubelet/pods/2c3949f0-4faa-4935-8d0f-7ce69d8de08d/volumes" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.790952 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.799885 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.906842 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.907173 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907185 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" 
containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907387 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907870 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907956 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.929457 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930500 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930597 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930974 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.931232 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993210 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: 
\"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: 
\"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" 
(UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.110579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.113440 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.244242 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.244696 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,Mou
ntPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdgkq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-22fkv_openstack(3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.246139 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-22fkv" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.262964 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.286602 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398131 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398249 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398384 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs" (OuterVolumeSpecName: "logs") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data" (OuterVolumeSpecName: "config-data") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399644 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400074 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts" (OuterVolumeSpecName: "scripts") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400695 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400726 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400738 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.402076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.407337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk" (OuterVolumeSpecName: "kube-api-access-tpvqk") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "kube-api-access-tpvqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.501955 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.502001 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.677859 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.678018 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kjprd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-z8p6p_openstack(d9eee838-721f-48cc-a5aa-37644a62d846): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.679352 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-z8p6p" 
podUID="d9eee838-721f-48cc-a5aa-37644a62d846" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.685321 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.806802 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807549 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 
18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.811262 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts" (OuterVolumeSpecName: "kube-api-access-g5gts") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "kube-api-access-g5gts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.849980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.852886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.856380 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config" (OuterVolumeSpecName: "config") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.858484 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911222 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911258 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911268 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911278 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911287 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.145591 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fb89f56b-287lx"] Jan 20 18:49:17 crc 
kubenswrapper[4773]: I0120 18:49:17.170504 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:49:17 crc kubenswrapper[4773]: W0120 18:49:17.180856 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49df8cea_026f_497b_baae_a6a09452aa3d.slice/crio-45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a WatchSource:0}: Error finding container 45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a: Status 404 returned error can't find the container with id 45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.188161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"e7763efb571d5026932f6475db9eef517adc622a0b2069ed44b92f35a03ab807"} Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee"} Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197765 4773 scope.go:117] "RemoveContainer" containerID="1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197879 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.202655 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.207784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864c6579d5-v5vdm" event={"ID":"609b419f-cc52-4fef-aa49-f64cdbba6755","Type":"ContainerDied","Data":"6a01b8d18b1bc750f52b0b014094ed52c18500c651ec5c625261c7be00925ccf"} Jan 20 18:49:17 crc kubenswrapper[4773]: E0120 18:49:17.224013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-22fkv" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" Jan 20 18:49:17 crc kubenswrapper[4773]: E0120 18:49:17.225053 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-z8p6p" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.317093 4773 scope.go:117] "RemoveContainer" containerID="526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.393693 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.407494 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.419093 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.428235 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 
18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.466460 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" path="/var/lib/kubelet/pods/03043146-8a8f-465e-b8c2-ca01d39cc070/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.467393 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609b419f-cc52-4fef-aa49-f64cdbba6755" path="/var/lib/kubelet/pods/609b419f-cc52-4fef-aa49-f64cdbba6755/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.467906 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" path="/var/lib/kubelet/pods/d326b299-f619-4c76-9a10-045d77fa9bae/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.533103 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.210420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerStarted","Data":"3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.214024 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215795 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" 
event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.218364 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"1945832064b5826b2347bb04240fd99ca07f116eef7af8940627839637ffed49"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.218401 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"c5fbc0607d6ceecd0698941eec371e01c10343f0285f2f65cacf67e32405f538"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.227592 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-49f25" podStartSLOduration=3.821026511 podStartE2EDuration="31.227574184s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.789458367 +0000 UTC m=+1121.711271391" lastFinishedPulling="2026-01-20 18:49:16.19600604 +0000 UTC m=+1149.117819064" observedRunningTime="2026-01-20 18:49:18.22366557 +0000 UTC m=+1151.145478594" watchObservedRunningTime="2026-01-20 18:49:18.227574184 +0000 UTC m=+1151.149387208" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.228557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" 
event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerStarted","Data":"dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.228591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerStarted","Data":"75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235422 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-586bf65fdf-tqctk" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log" containerID="cri-o://25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" gracePeriod=30 Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235439 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-586bf65fdf-tqctk" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon" containerID="cri-o://2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" gracePeriod=30 Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.252618 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68fb89f56b-287lx" 
podStartSLOduration=22.252597692 podStartE2EDuration="22.252597692s" podCreationTimestamp="2026-01-20 18:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.240737549 +0000 UTC m=+1151.162550593" watchObservedRunningTime="2026-01-20 18:49:18.252597692 +0000 UTC m=+1151.174410726" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.264572 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9b66d8476-cqhrd" podStartSLOduration=22.264551998 podStartE2EDuration="22.264551998s" podCreationTimestamp="2026-01-20 18:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.262428627 +0000 UTC m=+1151.184241651" watchObservedRunningTime="2026-01-20 18:49:18.264551998 +0000 UTC m=+1151.186365022" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.285774 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mhdc2" podStartSLOduration=3.285755095 podStartE2EDuration="3.285755095s" podCreationTimestamp="2026-01-20 18:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.281142584 +0000 UTC m=+1151.202955628" watchObservedRunningTime="2026-01-20 18:49:18.285755095 +0000 UTC m=+1151.207568129" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.336157 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-586bf65fdf-tqctk" podStartSLOduration=3.6725399899999998 podStartE2EDuration="31.33613374s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:49.002722065 +0000 UTC m=+1121.924535089" lastFinishedPulling="2026-01-20 18:49:16.666315815 +0000 UTC m=+1149.588128839" 
observedRunningTime="2026-01-20 18:49:18.298230424 +0000 UTC m=+1151.220043468" watchObservedRunningTime="2026-01-20 18:49:18.33613374 +0000 UTC m=+1151.257946774" Jan 20 18:49:19 crc kubenswrapper[4773]: I0120 18:49:19.257848 4773 generic.go:334] "Generic (PLEG): container finished" podID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerID="bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374" exitCode=0 Jan 20 18:49:19 crc kubenswrapper[4773]: I0120 18:49:19.257968 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerDied","Data":"bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374"} Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.623579 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788174 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788493 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc 
kubenswrapper[4773]: I0120 18:49:20.793450 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw" (OuterVolumeSpecName: "kube-api-access-b66lw") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "kube-api-access-b66lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.821359 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.839350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config" (OuterVolumeSpecName: "config") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890650 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890687 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890698 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerDied","Data":"f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c"} Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279671 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279709 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.348175 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.538833 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539467 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="init" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539491 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="init" Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539529 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539539 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539562 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539571 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539766 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539792 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.540868 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.554974 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.676058 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.684520 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690050 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690392 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690549 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690681 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7hwmj" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690797 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703336 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.804958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805074 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805191 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805240 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805277 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805320 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807339 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807788 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.808490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.842232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.877422 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907300 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: 
I0120 18:49:21.911663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.911690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.916453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.924005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.933619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.009761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.512873 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.723264 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.550398 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.552539 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.557162 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.558955 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.577331 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " 
pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " 
pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: 
I0120 18:49:23.751379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.787194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.788256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.799427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.799766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.821913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.822227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.822825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.881571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.321546 4773 generic.go:334] "Generic (PLEG): container finished" podID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerID="dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe" exitCode=0 Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.321608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerDied","Data":"dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe"} Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.323101 4773 generic.go:334] "Generic (PLEG): container finished" podID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerID="3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071" exitCode=0 Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.323126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerDied","Data":"3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071"} Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.567993 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.568357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.631649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.631694 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.261271 4773 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.261677 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351602 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351721 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerDied","Data":"75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d"}
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351767 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerDied","Data":"b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6"}
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357297 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357369 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.365149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"4e34f5d6513de50dbefb964db35642a2f245e6de8a45b2992d0938120feea1ea"}
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.369855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerStarted","Data":"de64a54b4415566f010f514c19c1ed77dea98963569b16a8bef020db9b593d9c"}
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415943 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416423 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416563 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") "
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.421051 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l" (OuterVolumeSpecName: "kube-api-access-xbz9l") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "kube-api-access-xbz9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.423465 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs" (OuterVolumeSpecName: "logs") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.426611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.428592 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts" (OuterVolumeSpecName: "scripts") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.435675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6" (OuterVolumeSpecName: "kube-api-access-j5kl6") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "kube-api-access-j5kl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.435675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.438089 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts" (OuterVolumeSpecName: "scripts") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.503114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data" (OuterVolumeSpecName: "config-data") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.517191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518524 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518543 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518557 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518570 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518582 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518593 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518604 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518618 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518629 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.521756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data" (OuterVolumeSpecName: "config-data") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.542843 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.575837 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"]
Jan 20 18:49:27 crc kubenswrapper[4773]: E0120 18:49:27.576308 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576326 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap"
Jan 20 18:49:27 crc kubenswrapper[4773]: E0120 18:49:27.576346 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576354 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576538 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576558 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.577267 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.584043 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-668885694d-2br7g"]
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.585757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.588702 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.589385 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.598652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.598885 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.610588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"]
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.622212 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.622244 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.629025 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668885694d-2br7g"]
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.724807 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725022 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725047 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.735985 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"]
Jan 20 18:49:27 crc kubenswrapper[4773]: W0120 18:49:27.741007 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf98c94f3_5e79_4d1a_9e1f_bab68689f193.slice/crio-22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819 WatchSource:0}: Error finding container 22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819: Status 404 returned error can't find the container with id 22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827452 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827667 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.832501 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.832779 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.834890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.840395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.840862 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.842051 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.844453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.844537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.845218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.845990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.847376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.847417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.850300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.852560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.857454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.983236 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-586bf65fdf-tqctk"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.009683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.019224 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.169557 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.169865 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"603a0369132b8169aa236a7eddb23ab65c240c9205a1a46e90ac0dbc6a9b8a53"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"17ebf8342cfe8d01a6fd735b5a31bd03a6a1a8d05aa665fb5a56f6d84446f1e3"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388113 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388801 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76cffc5d9-m6wn7"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.392835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.405572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.405619 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.406517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86d844bb6-6q8ms"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.412375 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76cffc5d9-m6wn7" podStartSLOduration=5.412361306 podStartE2EDuration="5.412361306s" podCreationTimestamp="2026-01-20 18:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:28.408391341 +0000 UTC m=+1161.330204365" watchObservedRunningTime="2026-01-20 18:49:28.412361306 +0000 UTC m=+1161.334174330"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.413517 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerID="37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb" exitCode=0
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.413554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb"}
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.437105 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86d844bb6-6q8ms" podStartSLOduration=7.437088618 podStartE2EDuration="7.437088618s" podCreationTimestamp="2026-01-20 18:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:28.434611018 +0000 UTC m=+1161.356424042" watchObservedRunningTime="2026-01-20 18:49:28.437088618 +0000 UTC m=+1161.358901642"
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.681571 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"]
Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.803694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668885694d-2br7g"]
Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480503 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-vd86k"
Jan 20 18:49:29 crc kubenswrapper[4773]:
I0120 18:49:29.480868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerStarted","Data":"9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"1a89ef16fbf048ea38800570a484a04f9bfb0028c397443443b40d4433afdf2c"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"7c482d480badaefe5bfd0770daa0a78cb9725e4408eee3b70b740b72ca521b09"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.496361 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" podStartSLOduration=8.496344624 podStartE2EDuration="8.496344624s" podCreationTimestamp="2026-01-20 18:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:29.492910871 +0000 UTC m=+1162.414723925" watchObservedRunningTime="2026-01-20 18:49:29.496344624 +0000 UTC m=+1162.418157648" Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.501989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bdd8cdbd7-xhf92" event={"ID":"03658323-86f4-42ec-b18f-163a1e7dcaed","Type":"ContainerStarted","Data":"e9ee5adb33e652dcc40a08d63ae35aa59a52df9520f3943c11d103f17bff8109"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.502053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bdd8cdbd7-xhf92" 
event={"ID":"03658323-86f4-42ec-b18f-163a1e7dcaed","Type":"ContainerStarted","Data":"f06a34cd2bd9bdaeabc311e9d39f9baae024a76f2e16d12427dec857969b1e46"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.502096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.528867 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bdd8cdbd7-xhf92" podStartSLOduration=2.528843491 podStartE2EDuration="2.528843491s" podCreationTimestamp="2026-01-20 18:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:29.517891659 +0000 UTC m=+1162.439704703" watchObservedRunningTime="2026-01-20 18:49:29.528843491 +0000 UTC m=+1162.450656525" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.512751 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"a400ab9de6c5a28afecfc78465f9061eb58f3340608384562ff2805663806251"} Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.513426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.513462 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.519100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerStarted","Data":"1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455"} Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.538143 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-668885694d-2br7g" podStartSLOduration=3.538123171 podStartE2EDuration="3.538123171s" podCreationTimestamp="2026-01-20 18:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:30.532560608 +0000 UTC m=+1163.454373642" watchObservedRunningTime="2026-01-20 18:49:30.538123171 +0000 UTC m=+1163.459936195" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.559349 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-22fkv" podStartSLOduration=3.175528246 podStartE2EDuration="43.559329878s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.808673286 +0000 UTC m=+1121.730486310" lastFinishedPulling="2026-01-20 18:49:29.192474918 +0000 UTC m=+1162.114287942" observedRunningTime="2026-01-20 18:49:30.555651509 +0000 UTC m=+1163.477464533" watchObservedRunningTime="2026-01-20 18:49:30.559329878 +0000 UTC m=+1163.481142902" Jan 20 18:49:31 crc kubenswrapper[4773]: I0120 18:49:31.530605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerStarted","Data":"afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d"} Jan 20 18:49:31 crc kubenswrapper[4773]: I0120 18:49:31.549527 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-z8p6p" podStartSLOduration=1.98366716 podStartE2EDuration="44.549508552s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.702565819 +0000 UTC m=+1121.624378833" lastFinishedPulling="2026-01-20 18:49:31.268407211 +0000 UTC m=+1164.190220225" observedRunningTime="2026-01-20 18:49:31.547306659 +0000 UTC m=+1164.469119703" watchObservedRunningTime="2026-01-20 18:49:31.549508552 +0000 UTC m=+1164.471321576" Jan 20 18:49:36 crc 
kubenswrapper[4773]: E0120 18:49:36.295274 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.569547 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.575202 4773 generic.go:334] "Generic (PLEG): container finished" podID="d9eee838-721f-48cc-a5aa-37644a62d846" containerID="afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d" exitCode=0 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.575478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerDied","Data":"afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d"} Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.578665 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579121 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579010 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" 
containerID="cri-o://fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579045 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" containerID="cri-o://3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.578917 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" containerID="cri-o://2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.633296 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68fb89f56b-287lx" podUID="cd9ba14c-8dca-4170-841c-6f5d5fa2b220" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.881125 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.927847 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.928103 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns" containerID="cri-o://5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" gracePeriod=10 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599324 4773 generic.go:334] "Generic 
(PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" exitCode=0 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599631 4773 generic.go:334] "Generic (PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" exitCode=2 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599713 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602651 4773 generic.go:334] "Generic (PLEG): container finished" podID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerID="5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" exitCode=0 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602814 4773 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.605485 4773 generic.go:334] "Generic (PLEG): container finished" podID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerID="1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455" exitCode=0 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.605640 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerDied","Data":"1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.608506 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731993 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.732012 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.740910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2" (OuterVolumeSpecName: "kube-api-access-48nn2") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "kube-api-access-48nn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.784164 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.808602 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.837686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config" (OuterVolumeSpecName: "config") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.850587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " Jan 20 18:49:37 crc kubenswrapper[4773]: W0120 18:49:37.851306 4773 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f667f5ef-cefc-40c0-a282-5d502cd45cd2/volumes/kubernetes.io~configmap/config Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.851324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config" (OuterVolumeSpecName: "config") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857321 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857356 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857369 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857380 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.860870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.928696 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.958999 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959743 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.963393 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd" (OuterVolumeSpecName: "kube-api-access-kjprd") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "kube-api-access-kjprd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.963715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.984142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062031 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062272 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062332 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.622995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" 
event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerDied","Data":"aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1"} Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623087 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623308 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623356 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.678696 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.698755 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776391 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"] Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776749 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="init" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776766 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="init" Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776780 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" containerName="barbican-db-sync" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776787 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" 
containerName="barbican-db-sync" Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776805 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776812 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777003 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" containerName="barbican-db-sync" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777024 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780536 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvh6j" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780810 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.797616 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.815754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.817258 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.821441 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879820 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc 
kubenswrapper[4773]: I0120 18:49:38.879911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880096 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod 
\"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880165 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.917635 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.953992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.955785 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.972011 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.981950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982274 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982317 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982365 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982431 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982516 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 
18:49:38.982608 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.984221 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.985271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.989628 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.999457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000053 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000403 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000795 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002047 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002569 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005831 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.003575 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.086987 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087304 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087460 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.088052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.088900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.089506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.090156 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.095373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.105303 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.112405 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.198986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 
18:49:39.199107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199295 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.207134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.215664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.220961 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.231162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.242529 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.250611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.273854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.286830 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.289184 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.299990 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300379 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.301053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.311127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.322086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts" (OuterVolumeSpecName: "scripts") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.332170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq" (OuterVolumeSpecName: "kube-api-access-tdgkq") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "kube-api-access-tdgkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.353003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.389248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data" (OuterVolumeSpecName: "config-data") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411601 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411643 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411853 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411865 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411875 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411886 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.475254 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" path="/var/lib/kubelet/pods/f667f5ef-cefc-40c0-a282-5d502cd45cd2/volumes" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.645893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerDied","Data":"2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99"} Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.646235 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.646156 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.892390 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.939989 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.967045 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.989825 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:39 crc kubenswrapper[4773]: E0120 18:49:39.990316 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.990356 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.990570 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.991705 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.998924 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.000732 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.001863 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fjv2n" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.002130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.022219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023208 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023454 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.038912 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.058733 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.114736 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.116127 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136476 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.146086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.149887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.157984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod 
\"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173332 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.201981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.203256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.205327 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.217653 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " 
pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238724 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238876 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: 
I0120 18:49:40.339005 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: 
\"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.343559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.343889 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.344101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.358414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.371300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.387358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.387971 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.393652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.410733 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.560214 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.586338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.671969 4773 generic.go:334] "Generic (PLEG): container finished" podID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerID="401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026" exitCode=0 Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.672395 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerDied","Data":"401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.672424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerStarted","Data":"5aba09a2ca6f3a6f60e7c18e4f45334529029f8eca1484896a33d9404a7b35ce"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.678217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"f52ae7a3134779398162e2cc2833f6ac6a66739dae8079844ccb344871f07b8a"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.679544 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"6bf46e026e5abc132b665eb31417655ab9af52f70306a4f98061d935daaa2c93"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.713798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.713842 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"1ec65c4d727460ec393332a2c46d1ecf2c32fcbc331c29099e6ae31f70e9f10d"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.714950 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.714974 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.747232 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podStartSLOduration=2.747210794 podStartE2EDuration="2.747210794s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:40.738471725 +0000 UTC m=+1173.660284759" watchObservedRunningTime="2026-01-20 18:49:40.747210794 +0000 UTC m=+1173.669023828" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.883106 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: W0120 18:49:40.904158 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe8dc2b_eac6_4606_9b32_848e3a273eef.slice/crio-b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133 WatchSource:0}: Error finding container b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133: Status 404 returned error can't find the container with id b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133 Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.352014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.399272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.488649 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.500073 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577668 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577779 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577878 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " 
Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578170 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") 
pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.580809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.582646 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd" (OuterVolumeSpecName: "kube-api-access-9mrxd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). 
InnerVolumeSpecName "kube-api-access-9mrxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.584411 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq" (OuterVolumeSpecName: "kube-api-access-4vlwq") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "kube-api-access-4vlwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.587142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts" (OuterVolumeSpecName: "scripts") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.604576 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config" (OuterVolumeSpecName: "config") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.605399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.607572 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.610301 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.612653 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.634448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.656922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data" (OuterVolumeSpecName: "config-data") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.680682 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.681774 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.681904 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682003 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682071 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682133 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682193 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682281 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682347 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682415 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682471 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682550 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.723948 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.726396 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.728831 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.728824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerDied","Data":"5aba09a2ca6f3a6f60e7c18e4f45334529029f8eca1484896a33d9404a7b35ce"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.729009 4773 scope.go:117] "RemoveContainer" containerID="401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731439 4773 generic.go:334] "Generic (PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" exitCode=0 Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"d0e8ddb6dbdcbfbf1e26c6a891d80cf5f965501af473ae07fda9dcc295cac646"} 
Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731516 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.733394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerStarted","Data":"61e0e72231f57915e14eb84dde43eba2a10986a7eb1f548177c2131ae5e71eff"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.830911 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.853671 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876104 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876526 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876545 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876576 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876583 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876598 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876604 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876621 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876627 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876804 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876820 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876827 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876840 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.880678 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.886157 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.886429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.889745 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.897162 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.903681 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989385 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 
crc kubenswrapper[4773]: I0120 18:49:41.989495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989555 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092449 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: 
I0120 18:49:42.093877 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.093911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.098512 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.101385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.105792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.109631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " 
pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.114521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.124321 4773 scope.go:117] "RemoveContainer" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.202855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.627143 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.743814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"7bc4da783a4581936a2d2574fb4406eecf3d9e58a1b1e68a3dbf5ed59a34cc62"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.124630 4773 scope.go:117] "RemoveContainer" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.348195 4773 scope.go:117] "RemoveContainer" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.418789 4773 scope.go:117] "RemoveContainer" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.419478 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": container with ID 
starting with 3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f not found: ID does not exist" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419523 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} err="failed to get container status \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": rpc error: code = NotFound desc = could not find container \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": container with ID starting with 3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f not found: ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419549 4773 scope.go:117] "RemoveContainer" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.419847 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": container with ID starting with fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21 not found: ID does not exist" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419880 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"} err="failed to get container status \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": rpc error: code = NotFound desc = could not find container \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": container with ID starting with fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21 not found: 
ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419902 4773 scope.go:117] "RemoveContainer" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.422103 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": container with ID starting with 2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6 not found: ID does not exist" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.422128 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} err="failed to get container status \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": rpc error: code = NotFound desc = could not find container \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": container with ID starting with 2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6 not found: ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.459703 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" path="/var/lib/kubelet/pods/077faa57-a75d-4f1a-b01e-3fc69ddb5761/volumes" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.460548 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" path="/var/lib/kubelet/pods/994b4766-5a87-44ae-b271-f16a3be4fda0/volumes" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.593168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:43 crc kubenswrapper[4773]: W0120 18:49:43.614178 4773 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c WatchSource:0}: Error finding container 88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c: Status 404 returned error can't find the container with id 88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.766233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.772966 4773 generic.go:334] "Generic (PLEG): container finished" podID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerID="5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7" exitCode=0 Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.773026 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.781264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"0751dc63fe825a3449a0eceb64da4673614dadccbb05d1a7ccd9706d5c0fd15f"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.781312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"01410284b151c46472fe04706a119e67d310c83c3c265ae4de03940b556a1357"} Jan 20 18:49:43 crc 
kubenswrapper[4773]: I0120 18:49:43.794705 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"ce9ee5db281cb21f70d3c2098c9e982b9c7d1d729753eb35415a8a3ebb63c256"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.794758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"7872eba03b0aed7130a58d220a89f78ab5744de8fc73b616bc3a4d5cca5c396d"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.821071 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" podStartSLOduration=2.674613618 podStartE2EDuration="5.821051787s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="2026-01-20 18:49:39.978544035 +0000 UTC m=+1172.900357059" lastFinishedPulling="2026-01-20 18:49:43.124982204 +0000 UTC m=+1176.046795228" observedRunningTime="2026-01-20 18:49:43.810505545 +0000 UTC m=+1176.732318569" watchObservedRunningTime="2026-01-20 18:49:43.821051787 +0000 UTC m=+1176.742864811" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.841610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" podStartSLOduration=2.610208797 podStartE2EDuration="5.841590288s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="2026-01-20 18:49:39.907528587 +0000 UTC m=+1172.829341611" lastFinishedPulling="2026-01-20 18:49:43.138910078 +0000 UTC m=+1176.060723102" observedRunningTime="2026-01-20 18:49:43.830655327 +0000 UTC m=+1176.752468361" watchObservedRunningTime="2026-01-20 18:49:43.841590288 +0000 UTC m=+1176.763403312" Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.806261 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.808943 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.808986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.810302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.812088 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerStarted","Data":"9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.813364 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.832404 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" podStartSLOduration=4.832380498 podStartE2EDuration="4.832380498s" podCreationTimestamp="2026-01-20 18:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:49:44.829594711 +0000 UTC m=+1177.751407745" watchObservedRunningTime="2026-01-20 18:49:44.832380498 +0000 UTC m=+1177.754193522" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.845250 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b"} Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.856818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.856974 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" containerID="cri-o://5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" gracePeriod=30 Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.857088 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.857133 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" containerID="cri-o://3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" gracePeriod=30 Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.878739 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.886579 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.891048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.891691 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.894725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.453429887 podStartE2EDuration="6.894707007s" podCreationTimestamp="2026-01-20 18:49:39 +0000 UTC" firstStartedPulling="2026-01-20 18:49:40.906907712 +0000 UTC m=+1173.828720736" lastFinishedPulling="2026-01-20 18:49:43.348184832 +0000 UTC m=+1176.269997856" observedRunningTime="2026-01-20 18:49:45.875802715 +0000 UTC m=+1178.797615749" watchObservedRunningTime="2026-01-20 18:49:45.894707007 +0000 UTC m=+1178.816520031" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.923450 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.923424164 podStartE2EDuration="5.923424164s" podCreationTimestamp="2026-01-20 18:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:45.895385854 +0000 UTC m=+1178.817198878" watchObservedRunningTime="2026-01-20 18:49:45.923424164 +0000 UTC m=+1178.845237188" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.928073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dmfd\" (UniqueName: 
\"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982500 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982718 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084682 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084746 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dmfd\" (UniqueName: \"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085004 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.090337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.091122 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.091470 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.094652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.095631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.110538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dmfd\" (UniqueName: 
\"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.206490 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.685168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:46 crc kubenswrapper[4773]: W0120 18:49:46.688314 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod436dcd32_51a0_4a9e_8a0a_fb852a5de1f0.slice/crio-74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd WatchSource:0}: Error finding container 74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd: Status 404 returned error can't find the container with id 74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.865454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd"} Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868034 4773 generic.go:334] "Generic (PLEG): container finished" podID="25154706-fb3d-45e9-b041-a925b21cf99e" containerID="3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" exitCode=0 Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868069 4773 generic.go:334] "Generic (PLEG): container finished" podID="25154706-fb3d-45e9-b041-a925b21cf99e" containerID="5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" exitCode=143 Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 
18:49:46.868083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c"} Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.863919 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920561 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920621 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920697 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920738 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920847 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.926326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.926815 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"7bc4da783a4581936a2d2574fb4406eecf3d9e58a1b1e68a3dbf5ed59a34cc62"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927322 4773 scope.go:117] "RemoveContainer" containerID="3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927720 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.928096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs" (OuterVolumeSpecName: "logs") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938178 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l" (OuterVolumeSpecName: "kube-api-access-t458l") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "kube-api-access-t458l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938349 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts" (OuterVolumeSpecName: "scripts") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"15c3b05a3fb5adde12caec644eef97734f0e971ada82d6d248d8ec5391ee4505"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938463 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"152e8d76edca114872034ad661c3ace516bc072a3a7ac62c230215e5836ad322"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938799 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938838 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.940805 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.969515 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7574cb8f94-wwkgd" podStartSLOduration=2.969494874 podStartE2EDuration="2.969494874s" podCreationTimestamp="2026-01-20 18:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:47.956716588 +0000 UTC m=+1180.878529602" watchObservedRunningTime="2026-01-20 18:49:47.969494874 +0000 UTC m=+1180.891307898" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.971194 4773 scope.go:117] "RemoveContainer" containerID="5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.977086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.980883 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data" (OuterVolumeSpecName: "config-data") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029289 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029327 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029338 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029346 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029355 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029364 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.271801 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.286009 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.294954 
4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: E0120 18:49:48.295361 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295378 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: E0120 18:49:48.295402 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295410 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295584 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295603 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.296478 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.300619 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.300843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.303883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.309559 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333447 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.334066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" 
(UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436224 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.437077 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.437111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc 
kubenswrapper[4773]: I0120 18:49:48.440820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.441753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.441911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: 
\"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.452326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.656843 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.677855 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.800894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.949968 4773 generic.go:334] "Generic (PLEG): container finished" podID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerID="2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" exitCode=137 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950001 4773 generic.go:334] "Generic (PLEG): container finished" podID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerID="25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" exitCode=137 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" 
event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:49.457414 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" path="/var/lib/kubelet/pods/25154706-fb3d-45e9-b041-a925b21cf99e/volumes" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.340310 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.433578 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.529270 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.563687 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.588956 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.620782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.633672 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.684470 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.684779 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" 
podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" containerID="cri-o://9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" gracePeriod=10 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970167 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerID="9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" exitCode=0 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970562 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" containerID="cri-o://7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970582 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" containerID="cri-o://aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.002246 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.027465 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.920627 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.927156 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.985495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.986901 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.988980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"0b60f30c0a7ab6c9d2a1ba89c628d4ad9f1438793f4c8b9c55bf5b64d977ae2b"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.989030 4773 scope.go:117] "RemoveContainer" containerID="2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.989131 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.992344 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler" containerID="cri-o://68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.992670 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.993012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"de64a54b4415566f010f514c19c1ed77dea98963569b16a8bef020db9b593d9c"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.993072 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe" containerID="cri-o://2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b" gracePeriod=30 Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.011814 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.269670754 podStartE2EDuration="11.011795053s" podCreationTimestamp="2026-01-20 18:49:41 +0000 UTC" firstStartedPulling="2026-01-20 18:49:43.618371412 +0000 UTC m=+1176.540184436" lastFinishedPulling="2026-01-20 18:49:51.360495711 +0000 UTC m=+1184.282308735" observedRunningTime="2026-01-20 18:49:52.009662943 +0000 UTC m=+1184.931475987" watchObservedRunningTime="2026-01-20 18:49:52.011795053 +0000 UTC m=+1184.933608077" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.021762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.021884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" 
(UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022104 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.023634 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs" (OuterVolumeSpecName: "logs") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.037101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc" (OuterVolumeSpecName: "kube-api-access-z7plc") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "kube-api-access-z7plc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.041753 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.042520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4" (OuterVolumeSpecName: "kube-api-access-dpcp4") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "kube-api-access-dpcp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.042903 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.107562 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts" (OuterVolumeSpecName: "scripts") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.112630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data" (OuterVolumeSpecName: "config-data") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147863 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147901 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147911 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147921 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147944 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147953 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.150961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: 
"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.182603 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config" (OuterVolumeSpecName: "config") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.182620 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.185351 4773 scope.go:117] "RemoveContainer" containerID="25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.185899 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.207662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.212918 4773 scope.go:117] "RemoveContainer" containerID="9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250198 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250235 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250248 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250258 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.265109 4773 scope.go:117] "RemoveContainer" containerID="37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.336468 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.347533 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.355768 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.370099 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.894296 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.012731 4773 generic.go:334] "Generic (PLEG): container finished" podID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerID="2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b" exitCode=0 Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.012848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b"} Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.022975 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"6ca3fa65c2bdb7d05f881279c32ca3a29ca5b5106b54c1b47b632b5dd3f883a7"} Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.472629 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" path="/var/lib/kubelet/pods/69d10de9-a03e-4020-8219-25cb3d9520a5/volumes" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.473607 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" path="/var/lib/kubelet/pods/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6/volumes" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.894101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:53 crc 
kubenswrapper[4773]: I0120 18:49:53.957899 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.958122 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86d844bb6-6q8ms" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" containerID="cri-o://e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" gracePeriod=30 Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.958209 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86d844bb6-6q8ms" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" containerID="cri-o://60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" gracePeriod=30 Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048190 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"0fbaaf3aba900c46337cdd72555a99aceefc82267b577ce70b1b81d2fc4b22c0"} Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"830274cb68640abc5cf8a072cf14bb3cb42b8a184d951d9f70e649ddc6f24086"} Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048251 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.089268 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.089248844 podStartE2EDuration="6.089248844s" podCreationTimestamp="2026-01-20 18:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:49:54.074072491 +0000 UTC m=+1186.995885515" watchObservedRunningTime="2026-01-20 18:49:54.089248844 +0000 UTC m=+1187.011061868"
Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.574226 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7574cb8f94-wwkgd"
Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625396 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"]
Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625614 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log" containerID="cri-o://1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" gracePeriod=30
Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625715 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api" containerID="cri-o://165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" gracePeriod=30
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.057186 4773 generic.go:334] "Generic (PLEG): container finished" podID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerID="60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" exitCode=0
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.057250 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16"}
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.059081 4773 generic.go:334] "Generic (PLEG): container finished" podID="49df8cea-026f-497b-baae-a6a09452aa3d" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" exitCode=0
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.059135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"}
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.060501 4773 generic.go:334] "Generic (PLEG): container finished" podID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" exitCode=143
Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.060557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"}
Jan 20 18:49:56 crc kubenswrapper[4773]: I0120 18:49:56.567252 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused"
Jan 20 18:49:56 crc kubenswrapper[4773]: I0120 18:49:56.879052 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout"
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.083244 4773 generic.go:334] "Generic (PLEG): container finished" podID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerID="68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7" exitCode=0
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.083300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7"}
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.264086 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352527 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352649 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352761 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") "
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.353078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.354041 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.358954 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts" (OuterVolumeSpecName: "scripts") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.359445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.361115 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz" (OuterVolumeSpecName: "kube-api-access-qbtjz") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "kube-api-access-qbtjz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.428064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455442 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455579 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455599 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.464055 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data" (OuterVolumeSpecName: "config-data") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.557754 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.086086 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133"}
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145488 4773 scope.go:117] "RemoveContainer" containerID="2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145729 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157362 4773 generic.go:334] "Generic (PLEG): container finished" podID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" exitCode=0
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157407 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"}
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157440 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"1ec65c4d727460ec393332a2c46d1ecf2c32fcbc331c29099e6ae31f70e9f10d"}
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157597 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.166877 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") "
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.166945 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") "
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") "
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167077 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") "
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167122 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") "
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.168754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs" (OuterVolumeSpecName: "logs") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171199 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171239 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171276 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171918 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171999 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344" gracePeriod=600
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.179381 4773 scope.go:117] "RemoveContainer" containerID="68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.183024 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.188081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.189909 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.191327 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7" (OuterVolumeSpecName: "kube-api-access-h8bj7") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "kube-api-access-h8bj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.207947 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208316 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208329 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208336 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208351 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208359 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="init"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208365 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="init"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208377 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208382 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208399 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208405 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208417 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208423 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208432 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208438 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208586 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208597 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208613 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208625 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208634 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208642 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208649 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.209522 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.212089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.235109 4773 scope.go:117] "RemoveContainer" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.248386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.249053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data" (OuterVolumeSpecName: "config-data") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.254040 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.274096 4773 scope.go:117] "RemoveContainer" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278818 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279027 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279136 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279147 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279156 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279165 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279232 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.313658 4773 scope.go:117] "RemoveContainer" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.316284 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": container with ID starting with 165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645 not found: ID does not exist" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316315 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"} err="failed to get container status \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": rpc error: code = NotFound desc = could not find container \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": container with ID starting with 165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645 not found: ID does not exist"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316334 4773 scope.go:117] "RemoveContainer" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"
Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.316628 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": container with ID starting with 1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5 not found: ID does not exist" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316650 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"} err="failed to get container status \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": rpc error: code = NotFound desc = could not find container \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": container with ID starting with 1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5 not found: ID does not exist"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.380996 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381135 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381199 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.387920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.388405 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.389211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.392366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.399846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0"
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.524109 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"]
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.530917 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"]
Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.565164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.055850 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.181091 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196291 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344" exitCode=0
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"}
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196395 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"}
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196411 4773 scope.go:117] "RemoveContainer" containerID="89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.204301 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"ae6d3e6d193bf50920cd851e6b7d9f5fc1799d295f41e9ca412ec695742c3a76"}
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.250351 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668885694d-2br7g"
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.459693 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" path="/var/lib/kubelet/pods/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2/volumes"
Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.460381 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" path="/var/lib/kubelet/pods/dfe8dc2b-eac6-4606-9b32-848e3a273eef/volumes"
Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.000924 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bdd8cdbd7-xhf92"
Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.238090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"ac0034e4bd44890d2711e86b526b656e18e87e3e4243bd6d835e8690bc7448b2"}
Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.844955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.116711 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.117773 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.120978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.123470 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-kxfsr"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.123532 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.129345 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238216 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgft\" (UniqueName: \"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient"
Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238495 4773 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.255463 4773 generic.go:334] "Generic (PLEG): container finished" podID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerID="e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" exitCode=0 Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.255531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe"} Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.261251 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"f8df994e3e0d9cc59d1919154bacdb40970a242eae5691909d9c3a996738eb17"} Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.340835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341106 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgft\" (UniqueName: \"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.342657 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.352460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.363483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.364177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgft\" (UniqueName: 
\"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.438532 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.574348 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.604114 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.6040921089999998 podStartE2EDuration="3.604092109s" podCreationTimestamp="2026-01-20 18:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:01.284674282 +0000 UTC m=+1194.206487316" watchObservedRunningTime="2026-01-20 18:50:01.604092109 +0000 UTC m=+1194.525905133" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647505 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647671 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hnc\" 
(UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647755 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647837 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.656412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc" (OuterVolumeSpecName: "kube-api-access-j2hnc") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "kube-api-access-j2hnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.660639 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.700492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config" (OuterVolumeSpecName: "config") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.717031 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751752 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751785 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751799 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751818 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc 
kubenswrapper[4773]: I0120 18:50:01.755543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.854372 4773 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.969479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 20 18:50:01 crc kubenswrapper[4773]: W0120 18:50:01.970763 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf040c75f_a2cb_4bfe_9fd1_0105887fa6b4.slice/crio-db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2 WatchSource:0}: Error finding container db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2: Status 404 returned error can't find the container with id db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2 Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.270594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4","Type":"ContainerStarted","Data":"db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2"} Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"4e34f5d6513de50dbefb964db35642a2f245e6de8a45b2992d0938120feea1ea"} Jan 20 
18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273196 4773 scope.go:117] "RemoveContainer" containerID="60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273208 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.293762 4773 scope.go:117] "RemoveContainer" containerID="e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.306747 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.314081 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:50:03 crc kubenswrapper[4773]: I0120 18:50:03.461613 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" path="/var/lib/kubelet/pods/a81115d7-0fb0-4319-9705-0fae198ad70b/volumes" Jan 20 18:50:03 crc kubenswrapper[4773]: I0120 18:50:03.565789 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 20 18:50:06 crc kubenswrapper[4773]: I0120 18:50:06.567312 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:50:07 crc kubenswrapper[4773]: I0120 18:50:07.999134 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:08 crc kubenswrapper[4773]: E0120 18:50:07.999727 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" 
containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999740 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: E0120 18:50:07.999754 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999759 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999953 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999976 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.000496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.015494 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.117266 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.118332 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.127899 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.188984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.189050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.224393 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.225484 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.227994 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.233638 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xgs\" (UniqueName: 
\"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.298789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.349080 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.351668 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.359741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.372601 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399388 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.400356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.400919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.424391 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.425682 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.428384 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.434219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.437099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.438171 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.437029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501252 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: 
\"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501843 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.521825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.545881 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.602661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.602713 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.608433 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.615561 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.629202 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.630608 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.632443 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.637439 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.637886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.704215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.704344 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.715018 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.806902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807186 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.827733 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.852771 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.982306 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:12 crc kubenswrapper[4773]: I0120 18:50:12.209101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 20 18:50:13 crc kubenswrapper[4773]: I0120 18:50:13.880830 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.029830 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.132694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.145297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.152801 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.295983 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.400794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerStarted","Data":"2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.400835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerStarted","Data":"efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402610 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerID="33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3" exitCode=0 Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerDied","Data":"33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerStarted","Data":"b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.405127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4","Type":"ContainerStarted","Data":"2c66796b42027cddb54ebf14b42a3ad401508203b482b5599165d709fb2ff2f1"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.407777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerStarted","Data":"ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.407813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerStarted","Data":"7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.410131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" 
event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerStarted","Data":"9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.410300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerStarted","Data":"17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.413184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerStarted","Data":"9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.415125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerStarted","Data":"7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.415166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerStarted","Data":"b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.426495 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jgwdl" podStartSLOduration=6.426477374 podStartE2EDuration="6.426477374s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.419261902 +0000 UTC m=+1207.341074926" watchObservedRunningTime="2026-01-20 18:50:14.426477374 +0000 UTC 
m=+1207.348290398" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.457219 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-f2d2j" podStartSLOduration=6.457198738 podStartE2EDuration="6.457198738s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.450192771 +0000 UTC m=+1207.372005795" watchObservedRunningTime="2026-01-20 18:50:14.457198738 +0000 UTC m=+1207.379011762" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.471886 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.864086634 podStartE2EDuration="13.47186401s" podCreationTimestamp="2026-01-20 18:50:01 +0000 UTC" firstStartedPulling="2026-01-20 18:50:01.973465391 +0000 UTC m=+1194.895278425" lastFinishedPulling="2026-01-20 18:50:13.581242777 +0000 UTC m=+1206.503055801" observedRunningTime="2026-01-20 18:50:14.466067501 +0000 UTC m=+1207.387880525" watchObservedRunningTime="2026-01-20 18:50:14.47186401 +0000 UTC m=+1207.393677034" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.493749 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" podStartSLOduration=6.493732592 podStartE2EDuration="6.493732592s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.4836012 +0000 UTC m=+1207.405414234" watchObservedRunningTime="2026-01-20 18:50:14.493732592 +0000 UTC m=+1207.415545616" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.516499 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-da16-account-create-update-nsb2n" 
podStartSLOduration=6.516480127 podStartE2EDuration="6.516480127s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.506168399 +0000 UTC m=+1207.427981443" watchObservedRunningTime="2026-01-20 18:50:14.516480127 +0000 UTC m=+1207.438293151" Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.426467 4773 generic.go:334] "Generic (PLEG): container finished" podID="7455911e-a1ad-442b-97b9-362496066bbf" containerID="ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.426571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerDied","Data":"ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.437342 4773 generic.go:334] "Generic (PLEG): container finished" podID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerID="9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.437407 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerDied","Data":"9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.439587 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerID="6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.439626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" 
event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerDied","Data":"6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.441522 4773 generic.go:334] "Generic (PLEG): container finished" podID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerID="7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.441561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerDied","Data":"7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.451335 4773 generic.go:334] "Generic (PLEG): container finished" podID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerID="2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.464080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerDied","Data":"2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.553381 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.553690 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" containerID="cri-o://4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" 
containerName="proxy-httpd" containerID="cri-o://cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554332 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" containerID="cri-o://2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554402 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" containerID="cri-o://7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.841095 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.039490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.039570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.040726 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" (UID: "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.045046 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm" (OuterVolumeSpecName: "kube-api-access-8kmxm") pod "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" (UID: "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7"). InnerVolumeSpecName "kube-api-access-8kmxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.142645 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.142683 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.462900 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" exitCode=0 Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463178 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" exitCode=2 Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463186 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" exitCode=0 Jan 20 
18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463221 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463275 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.465187 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.466060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerDied","Data":"b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.466119 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.567747 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:50:16 
crc kubenswrapper[4773]: I0120 18:50:16.567907 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.103860 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.272609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"833eac91-4269-4e1e-9923-8dd8ed2276dc\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.272901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"833eac91-4269-4e1e-9923-8dd8ed2276dc\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.273498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "833eac91-4269-4e1e-9923-8dd8ed2276dc" (UID: "833eac91-4269-4e1e-9923-8dd8ed2276dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.274487 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.285997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl" (OuterVolumeSpecName: "kube-api-access-clxwl") pod "833eac91-4269-4e1e-9923-8dd8ed2276dc" (UID: "833eac91-4269-4e1e-9923-8dd8ed2276dc"). InnerVolumeSpecName "kube-api-access-clxwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.316895 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.325036 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.330306 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.376033 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.377084 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"2bd3a449-dc14-46ca-8e19-64d0a282483e\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"f4f47b18-303f-415d-8bf8-c1f7a075b747\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"f4f47b18-303f-415d-8bf8-c1f7a075b747\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477047 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"2bd3a449-dc14-46ca-8e19-64d0a282483e\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477072 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477162 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bd3a449-dc14-46ca-8e19-64d0a282483e" (UID: "2bd3a449-dc14-46ca-8e19-64d0a282483e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.478254 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4f47b18-303f-415d-8bf8-c1f7a075b747" (UID: "f4f47b18-303f-415d-8bf8-c1f7a075b747"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.478673 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47dcb7c9-ffa7-46bc-b695-02aea6e679a1" (UID: "47dcb7c9-ffa7-46bc-b695-02aea6e679a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.483732 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs" (OuterVolumeSpecName: "kube-api-access-42xgs") pod "47dcb7c9-ffa7-46bc-b695-02aea6e679a1" (UID: "47dcb7c9-ffa7-46bc-b695-02aea6e679a1"). 
InnerVolumeSpecName "kube-api-access-42xgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.490193 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78" (OuterVolumeSpecName: "kube-api-access-w8t78") pod "f4f47b18-303f-415d-8bf8-c1f7a075b747" (UID: "f4f47b18-303f-415d-8bf8-c1f7a075b747"). InnerVolumeSpecName "kube-api-access-w8t78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.496350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv" (OuterVolumeSpecName: "kube-api-access-5m2hv") pod "2bd3a449-dc14-46ca-8e19-64d0a282483e" (UID: "2bd3a449-dc14-46ca-8e19-64d0a282483e"). InnerVolumeSpecName "kube-api-access-5m2hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.497542 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.500529 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerDied","Data":"17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.500571 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerDied","Data":"b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504705 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504793 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerDied","Data":"9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508315 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508430 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510309 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510487 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerDied","Data":"efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510643 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.514986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerDied","Data":"7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.515055 4773 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.515180 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.579593 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"7455911e-a1ad-442b-97b9-362496066bbf\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580028 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"7455911e-a1ad-442b-97b9-362496066bbf\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580480 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580557 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580620 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580698 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2hv\" (UniqueName: 
\"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580895 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.581033 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.581834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7455911e-a1ad-442b-97b9-362496066bbf" (UID: "7455911e-a1ad-442b-97b9-362496066bbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.587565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t" (OuterVolumeSpecName: "kube-api-access-4fp7t") pod "7455911e-a1ad-442b-97b9-362496066bbf" (UID: "7455911e-a1ad-442b-97b9-362496066bbf"). InnerVolumeSpecName "kube-api-access-4fp7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.682006 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.682349 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.918722 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987035 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987071 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987187 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987924 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.988151 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.995892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts" (OuterVolumeSpecName: "scripts") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.996134 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr" (OuterVolumeSpecName: "kube-api-access-ckpjr") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "kube-api-access-ckpjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.058486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099219 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099251 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099261 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099273 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.140068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data" (OuterVolumeSpecName: "config-data") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.142773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.202235 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.202277 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526058 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" exitCode=0 Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c"} Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526139 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526156 4773 scope.go:117] "RemoveContainer" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.557190 4773 scope.go:117] "RemoveContainer" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.558429 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.570203 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.577845 4773 scope.go:117] "RemoveContainer" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.584398 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.584976 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585078 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585149 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585210 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585351 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585543 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587383 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587504 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587576 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587632 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587691 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587760 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587813 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587883 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587962 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.588017 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588068 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.588119 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588168 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588494 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593120 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593213 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593249 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" 
Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593265 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593282 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593290 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593321 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593337 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593351 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.595895 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.597373 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.598814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.599112 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.616700 4773 scope.go:117] "RemoveContainer" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.645566 4773 scope.go:117] "RemoveContainer" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.645984 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": container with ID starting with cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1 not found: ID does not exist" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646012 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} err="failed to get container status \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": rpc error: code = NotFound desc = could not find container \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": container with ID starting with cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 
18:50:18.646034 4773 scope.go:117] "RemoveContainer" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646277 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": container with ID starting with 2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862 not found: ID does not exist" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646299 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} err="failed to get container status \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": rpc error: code = NotFound desc = could not find container \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": container with ID starting with 2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646314 4773 scope.go:117] "RemoveContainer" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646615 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": container with ID starting with 7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e not found: ID does not exist" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646637 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} err="failed to get container status \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": rpc error: code = NotFound desc = could not find container \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": container with ID starting with 7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646652 4773 scope.go:117] "RemoveContainer" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646886 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": container with ID starting with 4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77 not found: ID does not exist" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.647769 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} err="failed to get container status \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": rpc error: code = NotFound desc = could not find container \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": container with ID starting with 4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.714750 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: 
\"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715360 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 
18:50:18.715585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816804 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816970 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" 
(UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817711 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.818363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821034 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821701 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.830338 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.835771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.942153 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.376969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.459647 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5814cea-a704-4de4-9205-d65cde58c777" path="/var/lib/kubelet/pods/f5814cea-a704-4de4-9205-d65cde58c777/volumes" Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.570750 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"e5ac9b64580a1a30d80a522428972a274182638cad0934f179acc61a069da4b1"} Jan 20 18:50:20 crc kubenswrapper[4773]: I0120 18:50:20.578381 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275"} Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011482 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-conmon-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-conmon-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011846 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a.scope": 0x40000100 == 
IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011880 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-conmon-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-conmon-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011901 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011924 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011965 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.012004 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-conmon-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-conmon-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.012044 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope: no such file or 
directory Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.227558 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49df8cea_026f_497b_baae_a6a09452aa3d.slice/crio-conmon-7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-conmon-2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-conmon-33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e.scope\": RecentStats: unable to find data in memory cache]" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.513447 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590349 4773 generic.go:334] "Generic (PLEG): container finished" podID="49df8cea-026f-497b-baae-a6a09452aa3d" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" exitCode=137 Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590411 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590569 4773 scope.go:117] "RemoveContainer" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.592472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671303 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " 
Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671557 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671600 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671648 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: 
\"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.674568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs" (OuterVolumeSpecName: "logs") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.677248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl" (OuterVolumeSpecName: "kube-api-access-mhdcl") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "kube-api-access-mhdcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.677664 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.699764 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts" (OuterVolumeSpecName: "scripts") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.705133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data" (OuterVolumeSpecName: "config-data") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.723477 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.734503 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.760153 4773 scope.go:117] "RemoveContainer" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.773967 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774008 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774023 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774034 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774045 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774057 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774067 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809002 4773 scope.go:117] "RemoveContainer" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.809467 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": container with ID starting with aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4 not found: ID does not exist" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809502 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"} err="failed to get container status \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": rpc error: code = NotFound desc = could not find container \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": container with ID starting with aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4 not found: ID does not exist" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809520 4773 scope.go:117] "RemoveContainer" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.809867 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": container with ID starting with 7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909 not found: ID does not exist" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 
20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809918 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} err="failed to get container status \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": rpc error: code = NotFound desc = could not find container \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": container with ID starting with 7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909 not found: ID does not exist" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.922123 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.929579 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:50:22 crc kubenswrapper[4773]: I0120 18:50:22.602893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8"} Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.458143 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" path="/var/lib/kubelet/pods/49df8cea-026f-497b-baae-a6a09452aa3d/volumes" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.599523 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 18:50:23 crc kubenswrapper[4773]: E0120 18:50:23.600354 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600372 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" 
containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: E0120 18:50:23.600402 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600409 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600729 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600751 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.601487 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.619545 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.620233 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bxxbt" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.620346 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.659719 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: 
\"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713242 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713299 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713542 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.815567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822339 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.833702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.965300 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b"
Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.472887 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"]
Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.649671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2"}
Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.650916 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.652023 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerStarted","Data":"c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b"}
Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.676748 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.604609129 podStartE2EDuration="6.67672198s" podCreationTimestamp="2026-01-20 18:50:18 +0000 UTC" firstStartedPulling="2026-01-20 18:50:19.388058042 +0000 UTC m=+1212.309871056" lastFinishedPulling="2026-01-20 18:50:23.460170883 +0000 UTC m=+1216.381983907" observedRunningTime="2026-01-20 18:50:24.672491709 +0000 UTC m=+1217.594304733" watchObservedRunningTime="2026-01-20 18:50:24.67672198 +0000 UTC m=+1217.598535024"
Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.710783 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711531 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent" containerID="cri-o://3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275" gracePeriod=30
Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711639 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd" containerID="cri-o://ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2" gracePeriod=30
Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711676 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core" containerID="cri-o://3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8" gracePeriod=30
Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711709 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent" containerID="cri-o://89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651" gracePeriod=30
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731047 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2" exitCode=0
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731363 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8" exitCode=2
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731374 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275" exitCode=0
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2"}
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8"}
Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275"}
Jan 20 18:50:33 crc kubenswrapper[4773]: I0120 18:50:33.757225 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651" exitCode=0
Jan 20 18:50:33 crc kubenswrapper[4773]: I0120 18:50:33.757312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651"}
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.414393 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575760 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575883 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575908 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") "
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.576438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.576531 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.577124 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.577144 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.581397 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x" (OuterVolumeSpecName: "kube-api-access-8lr7x") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "kube-api-access-8lr7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.581630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts" (OuterVolumeSpecName: "scripts") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.602340 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.640862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.672294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data" (OuterVolumeSpecName: "config-data") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678560 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678592 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678604 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678613 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678622 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") on node \"crc\" DevicePath \"\""
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.778810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerStarted","Data":"ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d"}
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"e5ac9b64580a1a30d80a522428972a274182638cad0934f179acc61a069da4b1"}
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782680 4773 scope.go:117] "RemoveContainer" containerID="ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782786 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.804780 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" podStartSLOduration=2.049423102 podStartE2EDuration="12.804760715s" podCreationTimestamp="2026-01-20 18:50:23 +0000 UTC" firstStartedPulling="2026-01-20 18:50:24.490117729 +0000 UTC m=+1217.411930753" lastFinishedPulling="2026-01-20 18:50:35.245455342 +0000 UTC m=+1228.167268366" observedRunningTime="2026-01-20 18:50:35.799636802 +0000 UTC m=+1228.721449826" watchObservedRunningTime="2026-01-20 18:50:35.804760715 +0000 UTC m=+1228.726573739"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.807064 4773 scope.go:117] "RemoveContainer" containerID="3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.835134 4773 scope.go:117] "RemoveContainer" containerID="89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.845539 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.857823 4773 scope.go:117] "RemoveContainer" containerID="3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.858004 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872471 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872491 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core"
Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872513 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872521 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872539 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872546 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd"
Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872562 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872570 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872817 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872842 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872851 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872861 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.874533 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.877077 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.877302 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.881772 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983033 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983229 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983257 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085178 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.086107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089304 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.090076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.102201 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.192939 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:36 crc kubenswrapper[4773]: W0120 18:50:36.621214 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72607859_440f_410c_baaf_0bef4e81dc3c.slice/crio-9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7 WatchSource:0}: Error finding container 9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7: Status 404 returned error can't find the container with id 9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.622088 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.792132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7"}
Jan 20 18:50:37 crc kubenswrapper[4773]: I0120 18:50:37.458811 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" path="/var/lib/kubelet/pods/06b19db9-fa8b-46db-a1fd-204fd44c86a5/volumes"
Jan 20 18:50:37 crc kubenswrapper[4773]: I0120 18:50:37.801360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"}
Jan 20 18:50:38 crc kubenswrapper[4773]: I0120 18:50:38.811891 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"}
Jan 20 18:50:39 crc kubenswrapper[4773]: I0120 18:50:39.827019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"}
Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.844836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"}
Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.846530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.865538 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.516273181 podStartE2EDuration="6.865519404s" podCreationTimestamp="2026-01-20 18:50:35 +0000 UTC" firstStartedPulling="2026-01-20 18:50:36.623305286 +0000 UTC m=+1229.545118310" lastFinishedPulling="2026-01-20 18:50:40.972551509 +0000 UTC m=+1233.894364533" observedRunningTime="2026-01-20 18:50:41.863347542 +0000 UTC m=+1234.785160586" watchObservedRunningTime="2026-01-20 18:50:41.865519404 +0000 UTC m=+1234.787332428"
Jan 20 18:50:42 crc kubenswrapper[4773]: I0120 18:50:42.318749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.858416 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" containerID="cri-o://82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" gracePeriod=30
Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.858428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" containerID="cri-o://425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" gracePeriod=30
Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.858428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" containerID="cri-o://ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" gracePeriod=30
Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.859787 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" containerID="cri-o://b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" gracePeriod=30
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.770720 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876204 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" exitCode=0
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876236 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" exitCode=2
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876246 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" exitCode=0
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876257 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" exitCode=0
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"}
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876293 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876314 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"}
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"}
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"}
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876446 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7"}
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.880762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.880959 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881019 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881060 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881137 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") "
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881762 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.882536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.886415 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts" (OuterVolumeSpecName: "scripts") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.890111 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw" (OuterVolumeSpecName: "kube-api-access-85lhw") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "kube-api-access-85lhw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.898066 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.910372 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.944142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.976133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data" (OuterVolumeSpecName: "config-data") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983584 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983609 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983618 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983628 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983638 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983646 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983657 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.984841 4773 scope.go:117] 
"RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.004267 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.021873 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.022376 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022431 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022463 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.022946 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 
425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022980 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022995 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.023331 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023358 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not 
exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023412 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.023672 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023698 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023717 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024012 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID 
does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024035 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024535 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024580 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024925 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.025043 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026365 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container 
status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026396 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026633 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026660 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026897 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026952 4773 scope.go:117] "RemoveContainer" 
containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027244 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027267 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027473 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027494 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027800 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could 
not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027826 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028062 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028086 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028293 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028347 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 
18:50:46.028637 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.228800 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.245534 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.271650 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.272523 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.273041 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274033 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274044 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274063 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" 
containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274138 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274176 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274182 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274575 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274594 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274609 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274618 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.284172 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.287599 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.288025 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.295168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391280 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " 
pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391993 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.392078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.392131 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493868 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 
crc kubenswrapper[4773]: I0120 18:50:46.496065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.509816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.515747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: 
\"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.605654 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.046386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.464794 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" path="/var/lib/kubelet/pods/72607859-440f-410c-baaf-0bef4e81dc3c/volumes" Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.896384 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.896983 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"54c5a63008f7d55fe728b88afe044f2f142c0ed42ff2bb1fe5fb615542c76281"} Jan 20 18:50:48 crc kubenswrapper[4773]: I0120 18:50:48.905433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.915627 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.918222 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerID="ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d" exitCode=0 Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.918363 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerDied","Data":"ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d"} Jan 20 18:50:50 crc kubenswrapper[4773]: I0120 18:50:50.929575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} Jan 20 18:50:50 crc kubenswrapper[4773]: I0120 18:50:50.969354 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.494172565 podStartE2EDuration="4.969333212s" podCreationTimestamp="2026-01-20 18:50:46 +0000 UTC" firstStartedPulling="2026-01-20 18:50:47.063215114 +0000 UTC m=+1239.985028138" lastFinishedPulling="2026-01-20 18:50:50.538375761 +0000 UTC m=+1243.460188785" observedRunningTime="2026-01-20 18:50:50.954881796 +0000 UTC m=+1243.876694900" watchObservedRunningTime="2026-01-20 18:50:50.969333212 +0000 UTC m=+1243.891146226" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.304783 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475435 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475701 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.481303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts" (OuterVolumeSpecName: "scripts") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.481830 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd" (OuterVolumeSpecName: "kube-api-access-mqrtd") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "kube-api-access-mqrtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.502723 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.503725 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data" (OuterVolumeSpecName: "config-data") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578175 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578213 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578230 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578240 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944182 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerDied","Data":"c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b"} Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944570 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944594 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039175 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: E0120 18:50:52.039541 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039559 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039742 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.040349 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.042835 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bxxbt" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.043049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.049634 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.290877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.290963 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.292117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.298206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.303539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.310035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" 
(UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.357500 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.779595 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.949528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b123d99d-6cf6-4516-a5ae-7dcdf8262269","Type":"ContainerStarted","Data":"445c8401390dc1adbdebe6fc61f046d0afd7a7608b18bab7d7255ed2043b193c"} Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.970020 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b123d99d-6cf6-4516-a5ae-7dcdf8262269","Type":"ContainerStarted","Data":"6e77b1c0d9b3b5447f7c98b4543a70f7a070b707e881dc87a843481b5e1cc0a4"} Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.970384 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.989921 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9899028520000002 podStartE2EDuration="1.989902852s" podCreationTimestamp="2026-01-20 18:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:53.986108031 +0000 UTC m=+1246.907921075" watchObservedRunningTime="2026-01-20 18:50:53.989902852 +0000 UTC m=+1246.911715896" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.382064 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:57 crc 
kubenswrapper[4773]: I0120 18:50:57.863578 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.864850 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.867703 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.874861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.875647 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998153 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: 
\"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.038957 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.040025 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.044468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.051027 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.097842 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.099537 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.108173 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.108585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.112414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.113671 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.114310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.136561 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.204661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206419 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206493 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.247494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.252062 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.254404 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.289306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308922 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.313911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.316421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.322317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.323542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 
18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.326804 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.327984 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.339618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.344704 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.348628 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.349403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.350796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.358190 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.361257 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418433 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418489 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.422173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.423573 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.460974 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.525946 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.529967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530319 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.538097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.538601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.561621 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk4pz\" (UniqueName: 
\"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634699 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634726 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod 
\"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634795 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634838 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.635948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: 
I0120 18:50:58.636223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.641838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.646630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.684631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.684729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.689570 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.718419 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.764644 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.841198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.964333 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.029413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerStarted","Data":"c91e5fa12c0b7ce3e47c60fb183a0b0468674854812e9596d10c1c95981af1d7"} Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.031858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerStarted","Data":"11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63"} Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.053399 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.054745 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.057278 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.057286 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.066272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.079250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.215589 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.243391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.249771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250565 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250806 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 
18:50:59.352203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.361913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.362284 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.372760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.380208 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.509423 4773 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.680610 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.040352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"135e3a1a083b2b65bf6bedbd7f61f04220039e408cb36b8c474ab0c91ec914a7"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.041327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerStarted","Data":"a26dde9426d7bf6401c218a90f41c5f6ca8484f70a01b6ffde63301f200a5f7e"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.043013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerStarted","Data":"8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.045220 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerStarted","Data":"18a3bac927d957a20e3b22301a468180e092097c9b7ed08358a71dea44a2f4fb"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.046162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"2a35ed65f92f9e8ee824da1249f99dade421d53e6d45bbd8bdafdaa51537c79e"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.066569 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-p6rjg" 
podStartSLOduration=3.066547582 podStartE2EDuration="3.066547582s" podCreationTimestamp="2026-01-20 18:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:00.057608768 +0000 UTC m=+1252.979421792" watchObservedRunningTime="2026-01-20 18:51:00.066547582 +0000 UTC m=+1252.988360616" Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.162370 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.057908 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerStarted","Data":"c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da"} Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.935620 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.952879 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.068836 4773 generic.go:334] "Generic (PLEG): container finished" podID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263" exitCode=0 Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.068895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"} Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.074182 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" 
event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerStarted","Data":"eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6"} Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.175051 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" podStartSLOduration=3.175033029 podStartE2EDuration="3.175033029s" podCreationTimestamp="2026-01-20 18:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:02.128780873 +0000 UTC m=+1255.050593897" watchObservedRunningTime="2026-01-20 18:51:02.175033029 +0000 UTC m=+1255.096846053" Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.085959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerStarted","Data":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"} Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.086050 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.105042 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" podStartSLOduration=5.10502068 podStartE2EDuration="5.10502068s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:03.104361615 +0000 UTC m=+1256.026174639" watchObservedRunningTime="2026-01-20 18:51:03.10502068 +0000 UTC m=+1256.026833704" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.122287 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerStarted","Data":"d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.124181 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.124223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126604 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" containerID="cri-o://277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126630 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" 
containerID="cri-o://671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.131434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerStarted","Data":"49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.131558 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.156752 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.060681417 podStartE2EDuration="9.156735342s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.222829514 +0000 UTC m=+1252.144642538" lastFinishedPulling="2026-01-20 18:51:05.318883439 +0000 UTC m=+1258.240696463" observedRunningTime="2026-01-20 18:51:07.153384822 +0000 UTC m=+1260.075197856" watchObservedRunningTime="2026-01-20 18:51:07.156735342 +0000 UTC m=+1260.078548366" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.180256 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.834797263 podStartE2EDuration="9.180236894s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:58.975334774 +0000 UTC m=+1251.897147798" lastFinishedPulling="2026-01-20 18:51:05.320774405 +0000 UTC m=+1258.242587429" observedRunningTime="2026-01-20 18:51:07.172471478 +0000 UTC m=+1260.094284522" watchObservedRunningTime="2026-01-20 18:51:07.180236894 +0000 
UTC m=+1260.102049918" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.200360 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.134334489 podStartE2EDuration="9.200340875s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.251841889 +0000 UTC m=+1252.173654913" lastFinishedPulling="2026-01-20 18:51:05.317848275 +0000 UTC m=+1258.239661299" observedRunningTime="2026-01-20 18:51:07.196700398 +0000 UTC m=+1260.118513422" watchObservedRunningTime="2026-01-20 18:51:07.200340875 +0000 UTC m=+1260.122153899" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.219522 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9697726429999998 podStartE2EDuration="9.219498884s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.067956349 +0000 UTC m=+1251.989769363" lastFinishedPulling="2026-01-20 18:51:05.31768258 +0000 UTC m=+1258.239495604" observedRunningTime="2026-01-20 18:51:07.217557466 +0000 UTC m=+1260.139370490" watchObservedRunningTime="2026-01-20 18:51:07.219498884 +0000 UTC m=+1260.141311908" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.152305 4773 generic.go:334] "Generic (PLEG): container finished" podID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerID="8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15" exitCode=0 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.152421 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerDied","Data":"8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.154650 4773 generic.go:334] "Generic (PLEG): container finished" podID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" 
containerID="671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" exitCode=0 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.154665 4773 generic.go:334] "Generic (PLEG): container finished" podID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerID="277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" exitCode=143 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.155465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.155491 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.359625 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.484581 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.527101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.527480 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659827 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659944 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659971 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.660008 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.661373 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs" (OuterVolumeSpecName: "logs") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.665837 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz" (OuterVolumeSpecName: "kube-api-access-7srwz") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "kube-api-access-7srwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.690967 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691004 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691104 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data" (OuterVolumeSpecName: "config-data") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.726621 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761616 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761655 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761669 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761679 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.766108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.828423 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.829221 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" containerID="cri-o://9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" gracePeriod=10 Jan 20 
18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.167858 4773 generic.go:334] "Generic (PLEG): container finished" podID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerID="9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" exitCode=0 Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.167916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323"} Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.170145 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.172826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"135e3a1a083b2b65bf6bedbd7f61f04220039e408cb36b8c474ab0c91ec914a7"} Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.172872 4773 scope.go:117] "RemoveContainer" containerID="671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.226653 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.227304 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.231529 4773 scope.go:117] "RemoveContainer" containerID="277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.241806 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283027 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: E0120 18:51:09.283427 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283443 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: E0120 18:51:09.283477 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283486 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283676 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283719 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.284811 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.288149 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.292543 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.302513 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.389660 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.459116 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" path="/var/lib/kubelet/pods/6709a6f8-82d5-4b67-b4db-a35f9e88a664/volumes" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486143 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486279 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: 
\"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587774 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587845 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: 
\"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587916 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588790 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.589266 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.594268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.595062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf" (OuterVolumeSpecName: "kube-api-access-99tvf") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "kube-api-access-99tvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.603697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610131 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610367 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.642918 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.643258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.648774 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.650020 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config" (OuterVolumeSpecName: "config") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.686194 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694298 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694486 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694586 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694678 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694758 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.179060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:10 crc kubenswrapper[4773]: W0120 18:51:10.193009 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode93e7391_bd5a_45f1_b0cd_15f8f39ba094.slice/crio-3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7 WatchSource:0}: Error finding container 3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7: Status 404 returned error can't find the container 
with id 3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7 Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220120 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"61e0e72231f57915e14eb84dde43eba2a10986a7eb1f548177c2131ae5e71eff"} Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220172 4773 scope.go:117] "RemoveContainer" containerID="9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220262 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.307689 4773 scope.go:117] "RemoveContainer" containerID="5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.311749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.330228 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.384517 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515638 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515696 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515742 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.520651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p" (OuterVolumeSpecName: "kube-api-access-zxx8p") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "kube-api-access-zxx8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.521170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts" (OuterVolumeSpecName: "scripts") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.541981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data" (OuterVolumeSpecName: "config-data") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.548566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617783 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617819 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617837 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617848 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.268912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.269278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.269296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270620 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerDied","Data":"11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270680 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.288486 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.288466848 podStartE2EDuration="2.288466848s" podCreationTimestamp="2026-01-20 18:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:11.287043514 +0000 UTC m=+1264.208856548" watchObservedRunningTime="2026-01-20 18:51:11.288466848 +0000 UTC m=+1264.210279882" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.474600 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" path="/var/lib/kubelet/pods/93029dbe-6bb4-45aa-a72a-13e4ffc2537e/volumes" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.572367 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.572590 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" containerID="cri-o://c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc" gracePeriod=30 Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.573066 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" containerID="cri-o://df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e" gracePeriod=30 Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.585470 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.597764 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.280441 4773 generic.go:334] "Generic (PLEG): container finished" podID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerID="eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6" exitCode=0 Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.280538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerDied","Data":"eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6"} Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.283036 4773 generic.go:334] "Generic (PLEG): container finished" podID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerID="c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc" exitCode=143 Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.283142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc"} Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 
18:51:12.283408 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" containerID="cri-o://d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.292796 4773 generic.go:334] "Generic (PLEG): container finished" podID="770606ab-65d2-4537-a335-6953af47241a" containerID="d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" exitCode=0 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.292874 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerDied","Data":"d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903"} Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.293859 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" containerID="cri-o://c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.294059 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" containerID="cri-o://7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.573082 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672414 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672507 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.678578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz" (OuterVolumeSpecName: "kube-api-access-lk4pz") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "kube-api-access-lk4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.699651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.707298 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data" (OuterVolumeSpecName: "config-data") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.715659 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774657 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774689 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774701 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.849870 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876391 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876529 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.880268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts" (OuterVolumeSpecName: "scripts") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.880793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2" (OuterVolumeSpecName: "kube-api-access-gzbk2") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "kube-api-access-gzbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.901235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data" (OuterVolumeSpecName: "config-data") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.902766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.977888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.977984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978053 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978194 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978630 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978646 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978655 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978664 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs" (OuterVolumeSpecName: "logs") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.981546 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9" (OuterVolumeSpecName: "kube-api-access-rwkk9") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "kube-api-access-rwkk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.002025 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data" (OuterVolumeSpecName: "config-data") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.005282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.027078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080040 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080093 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080109 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080120 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080132 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.304984 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.304979 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerDied","Data":"c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.305166 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307622 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerDied","Data":"18a3bac927d957a20e3b22301a468180e092097c9b7ed08358a71dea44a2f4fb"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307682 4773 scope.go:117] "RemoveContainer" containerID="d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307819 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315437 4773 generic.go:334] "Generic (PLEG): container finished" podID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" exitCode=0 Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315493 4773 generic.go:334] "Generic (PLEG): container finished" podID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" exitCode=143 Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315518 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315547 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315632 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.348759 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.467527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468157 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468175 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468190 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="init" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468196 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="init" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468206 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468212 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468223 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468229 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 
18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468242 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468247 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468259 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468264 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468283 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468289 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468431 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468457 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468465 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468480 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" 
containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468495 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468516 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.469081 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.477881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.478005 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.500222 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521071 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521311 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.521890 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521916 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} err="failed to get container status \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521959 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.522220 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522242 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} err="failed to get container status \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522256 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 
18:51:14.522502 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} err="failed to get container status \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522516 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522697 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} err="failed to get container status \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.540257 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.554596 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.556490 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.560631 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.561139 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.572137 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.579165 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.596893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.596981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.597004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.611335 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.613673 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.616179 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.616966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.621113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698839 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698891 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698944 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.699039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.699112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.703832 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.705500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.715165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.797531 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.803563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.803910 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.825177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.883164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.904275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.909067 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.913068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.913859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.923220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0"
Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.931514 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.248806 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.327750 4773 generic.go:334] "Generic (PLEG): container finished" podID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerID="df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e" exitCode=0
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.327793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e"}
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.331710 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7970e552-0aac-436b-ba20-4810e82dcd20","Type":"ContainerStarted","Data":"2ab19617b49acc68e33070c92ef95686d716679192c7e1bb7c90563b37a8db42"}
Jan 20 18:51:15 crc kubenswrapper[4773]: W0120 18:51:15.385332 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9c6096_2ce8_4b43_a638_50374d21d621.slice/crio-1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a WatchSource:0}: Error finding container 1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a: Status 404 returned error can't find the container with id 1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.387803 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.465389 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770606ab-65d2-4537-a335-6953af47241a" path="/var/lib/kubelet/pods/770606ab-65d2-4537-a335-6953af47241a/volumes"
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.466175 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" path="/var/lib/kubelet/pods/e93e7391-bd5a-45f1-b0cd-15f8f39ba094/volumes"
Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.466889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.286605 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341196 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"2a35ed65f92f9e8ee824da1249f99dade421d53e6d45bbd8bdafdaa51537c79e"}
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341232 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341261 4773 scope.go:117] "RemoveContainer" containerID="df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.343631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a"}
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.345528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7970e552-0aac-436b-ba20-4810e82dcd20","Type":"ContainerStarted","Data":"6c46258b994452cf856ed0515ba72c98f2dc18e32acd58b9888553b3e54d162a"}
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.346409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerStarted","Data":"220ce0272e6c403af39eafd69462c79884cab6d380e200929ec876ec12a03a06"}
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.362280 4773 scope.go:117] "RemoveContainer" containerID="c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426510 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") "
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") "
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") "
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426627 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") "
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.427845 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs" (OuterVolumeSpecName: "logs") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.430876 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk" (OuterVolumeSpecName: "kube-api-access-xkqxk") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "kube-api-access-xkqxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.457661 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data" (OuterVolumeSpecName: "config-data") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.462701 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528636 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528678 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528691 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528708 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.620374 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.718043 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.733158 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.741742 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:16 crc kubenswrapper[4773]: E0120 18:51:16.742349 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api"
Jan 20 18:51:16 crc kubenswrapper[4773]: E0120 18:51:16.742403 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742412 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742695 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742719 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.747340 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.751187 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.753036 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833262 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.935015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.935727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.943602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.944397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.956113 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.066215 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.359163 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"}
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.359206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"}
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.363307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerStarted","Data":"52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b"}
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.366312 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.387145 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.387120615 podStartE2EDuration="3.387120615s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.381782037 +0000 UTC m=+1270.303595061" watchObservedRunningTime="2026-01-20 18:51:17.387120615 +0000 UTC m=+1270.308933639"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.401273 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.401230892 podStartE2EDuration="3.401230892s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.399004589 +0000 UTC m=+1270.320817633" watchObservedRunningTime="2026-01-20 18:51:17.401230892 +0000 UTC m=+1270.323043916"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.422874 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.422847219 podStartE2EDuration="3.422847219s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.414663833 +0000 UTC m=+1270.336476877" watchObservedRunningTime="2026-01-20 18:51:17.422847219 +0000 UTC m=+1270.344660243"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.457867 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" path="/var/lib/kubelet/pods/b8ddb59d-f815-43b0-8d46-31575ad7703f/volumes"
Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.520213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:17 crc kubenswrapper[4773]: W0120 18:51:17.528571 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f3317e_9e4d_442d_a5b2_d9633262f332.slice/crio-4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80 WatchSource:0}: Error finding container 4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80: Status 404 returned error can't find the container with id 4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80
Jan 20 18:51:18 crc kubenswrapper[4773]: I0120 18:51:18.385643 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"}
Jan 20 18:51:18 crc kubenswrapper[4773]: I0120 18:51:18.385998 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80"}
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.394777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"}
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.431496 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.431474128 podStartE2EDuration="3.431474128s" podCreationTimestamp="2026-01-20 18:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:19.416706134 +0000 UTC m=+1272.338519158" watchObservedRunningTime="2026-01-20 18:51:19.431474128 +0000 UTC m=+1272.353287142"
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.439872 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.440088 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" containerID="cri-o://0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" gracePeriod=30
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.883814 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.921908 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.931695 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.931747 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.987790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") "
Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.993826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh" (OuterVolumeSpecName: "kube-api-access-jjqgh") pod "d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" (UID: "d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae"). InnerVolumeSpecName "kube-api-access-jjqgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.090363 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403502 4773 generic.go:334] "Generic (PLEG): container finished" podID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" exitCode=2
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403571 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerDied","Data":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"}
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerDied","Data":"381de174237c80b24a95594eb30259e92e84f6fa102ffa5688eefcf07e0ea711"}
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403916 4773 scope.go:117] "RemoveContainer" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.434385 4773 scope.go:117] "RemoveContainer" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"
Jan 20 18:51:20 crc kubenswrapper[4773]: E0120 18:51:20.435249 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": container with ID starting with 0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382 not found: ID does not exist" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.435285 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"} err="failed to get container status \"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": rpc error: code = NotFound desc = could not find container \"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": container with ID starting with 0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382 not found: ID does not exist"
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.435998 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436254 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" containerID="cri-o://9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" gracePeriod=30
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" containerID="cri-o://4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" gracePeriod=30
Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436610 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core"
containerID="cri-o://1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436656 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" containerID="cri-o://331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.470421 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.486189 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.494698 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: E0120 18:51:20.495215 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.495236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.495463 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.496088 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.502020 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.514968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.514968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698910 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.703692 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.703717 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.704326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.717589 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.834900 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.309893 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:21 crc kubenswrapper[4773]: W0120 18:51:21.323761 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2f1ada_ddef_454d_bdb7_fd695ee8f4ea.slice/crio-c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde WatchSource:0}: Error finding container c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde: Status 404 returned error can't find the container with id c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414212 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" exitCode=0 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414241 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" exitCode=2 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414249 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" exitCode=0 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414285 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414477 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.415591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea","Type":"ContainerStarted","Data":"c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.457176 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" path="/var/lib/kubelet/pods/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae/volumes" Jan 20 18:51:22 crc kubenswrapper[4773]: I0120 18:51:22.426460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea","Type":"ContainerStarted","Data":"569a5f5031a6b3558413fdead634a21c6b98e859bb254fb92034bc49c971f93b"} Jan 20 18:51:23 crc kubenswrapper[4773]: I0120 18:51:23.433581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 20 18:51:23 crc kubenswrapper[4773]: I0120 18:51:23.461106 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.811110049 podStartE2EDuration="3.461086931s" podCreationTimestamp="2026-01-20 18:51:20 +0000 UTC" firstStartedPulling="2026-01-20 18:51:21.326125419 +0000 UTC m=+1274.247938443" lastFinishedPulling="2026-01-20 18:51:21.976102301 +0000 UTC m=+1274.897915325" observedRunningTime="2026-01-20 18:51:23.450558088 +0000 UTC 
m=+1276.372371112" watchObservedRunningTime="2026-01-20 18:51:23.461086931 +0000 UTC m=+1276.382899965" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.824321 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.884189 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.910787 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.932014 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.932082 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.434531 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.473826 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" exitCode=0 Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474650 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"54c5a63008f7d55fe728b88afe044f2f142c0ed42ff2bb1fe5fb615542c76281"} Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474872 4773 scope.go:117] "RemoveContainer" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.533725 4773 scope.go:117] "RemoveContainer" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.534201 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.575607 4773 scope.go:117] "RemoveContainer" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585711 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: 
\"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585760 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586604 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.587881 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.588638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.591220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd" (OuterVolumeSpecName: "kube-api-access-rj2gd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "kube-api-access-rj2gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.591318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts" (OuterVolumeSpecName: "scripts") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.614658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.614733 4773 scope.go:117] "RemoveContainer" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.685609 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data" (OuterVolumeSpecName: "config-data") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.686893 4773 scope.go:117] "RemoveContainer" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.687398 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": container with ID starting with 4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b not found: ID does not exist" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687444 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} err="failed to get container status \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": rpc error: code = NotFound desc = could not find container \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": container with ID starting with 4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687465 4773 scope.go:117] "RemoveContainer" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.687830 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": container with ID starting with 1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c not found: ID does not exist" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687868 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} err="failed to get container status \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": rpc error: code = NotFound desc = could not find container \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": container with ID starting with 1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687887 4773 scope.go:117] "RemoveContainer" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.688115 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": container with ID starting with 331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643 not found: ID does not exist" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688138 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} err="failed to get container status \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": rpc error: code = NotFound desc = could not find container \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": container with ID starting with 331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643 not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688159 4773 scope.go:117] "RemoveContainer" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 
18:51:25.688642 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": container with ID starting with 9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086 not found: ID does not exist" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688667 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} err="failed to get container status \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": rpc error: code = NotFound desc = could not find container \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": container with ID starting with 9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086 not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690105 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690144 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690162 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690173 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690184 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.696520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.791925 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.814086 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.822647 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.864745 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.867765 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.867871 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 
18:51:25.868034 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880244 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.880338 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880346 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.880366 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880372 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880653 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880667 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880677 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880695 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc 
kubenswrapper[4773]: I0120 18:51:25.882422 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884590 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.888439 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.947098 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.947459 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 
18:51:25.996453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996540 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098463 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc 
kubenswrapper[4773]: I0120 18:51:26.098672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098689 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.099139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.099428 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" 
(UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.102677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.105771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.115674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.115897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.116282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.124557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhn7\" (UniqueName: 
\"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.208534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.718716 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:26 crc kubenswrapper[4773]: W0120 18:51:26.726846 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b106f16_e8b7_4cc5_a5be_fba349150373.slice/crio-f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694 WatchSource:0}: Error finding container f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694: Status 404 returned error can't find the container with id f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694 Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.067065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.067103 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.455720 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" path="/var/lib/kubelet/pods/efb19124-8662-46a5-8fb4-7fbaeba8885a/volumes" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.503217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.503270 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694"} Jan 20 18:51:28 crc kubenswrapper[4773]: I0120 18:51:28.148206 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:28 crc kubenswrapper[4773]: I0120 18:51:28.148232 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:29 crc kubenswrapper[4773]: I0120 18:51:29.549176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} Jan 20 18:51:30 crc kubenswrapper[4773]: I0120 18:51:30.559311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} Jan 20 18:51:30 crc kubenswrapper[4773]: I0120 18:51:30.844954 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.569803 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.571147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.597334 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.300629086 podStartE2EDuration="6.597314149s" podCreationTimestamp="2026-01-20 18:51:25 +0000 UTC" firstStartedPulling="2026-01-20 18:51:26.730518065 +0000 UTC m=+1279.652331089" lastFinishedPulling="2026-01-20 18:51:31.027203128 +0000 UTC m=+1283.949016152" observedRunningTime="2026-01-20 18:51:31.586454019 +0000 UTC m=+1284.508267043" watchObservedRunningTime="2026-01-20 18:51:31.597314149 +0000 UTC m=+1284.519127173" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.937894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.938361 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.942565 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.944038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.071035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072638 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072673 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.075345 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.078574 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.307192 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.308793 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.338877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410436 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410513 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.512962 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517344 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517400 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.525614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.547476 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.620674 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerID="49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" exitCode=137 Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.621431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerDied","Data":"49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e"} Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.636503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.757020 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.830519 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.831191 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.831299 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.838248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9" (OuterVolumeSpecName: "kube-api-access-npbs9") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "kube-api-access-npbs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.875133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.891366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data" (OuterVolumeSpecName: "config-data") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954228 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954278 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954295 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.090791 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerDied","Data":"c91e5fa12c0b7ce3e47c60fb183a0b0468674854812e9596d10c1c95981af1d7"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630425 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630769 4773 scope.go:117] "RemoveContainer" containerID="49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.632037 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd" exitCode=0 Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.632082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.633426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerStarted","Data":"d35e4b0ceb4787bad4a95ece01b001352569b9bbc51780aec8a8c24c4fa207e2"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.793987 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.802180 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.833167 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: E0120 18:51:38.833704 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.833797 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" 
containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.834044 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.834673 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837445 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.846647 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc 
kubenswrapper[4773]: I0120 18:51:38.973778 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075554 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc 
kubenswrapper[4773]: I0120 18:51:39.075593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075668 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081409 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.083373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.092468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.151527 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.461483 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" path="/var/lib/kubelet/pods/cc9522a8-e87a-485b-85e6-9548b4f7c835/volumes" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.615105 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: W0120 18:51:39.616395 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5b56a3_1c91_4347_ae44_63f05c35e134.slice/crio-58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e WatchSource:0}: Error finding container 58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e: Status 404 returned error can't find the container with id 58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.645073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerStarted","Data":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"} Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.645311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.646692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c5b56a3-1c91-4347-ae44-63f05c35e134","Type":"ContainerStarted","Data":"58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e"} Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.672056 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" 
podStartSLOduration=2.672037634 podStartE2EDuration="2.672037634s" podCreationTimestamp="2026-01-20 18:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:39.661067022 +0000 UTC m=+1292.582880076" watchObservedRunningTime="2026-01-20 18:51:39.672037634 +0000 UTC m=+1292.593850658" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.738832 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.739045 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" containerID="cri-o://b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.739795 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" containerID="cri-o://f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.932783 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933299 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" containerID="cri-o://9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933328 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" 
containerID="cri-o://656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933372 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" containerID="cri-o://97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933404 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" containerID="cri-o://774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" gracePeriod=30 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660044 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" exitCode=0 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660083 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" exitCode=2 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660096 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" exitCode=0 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660169 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.661894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c5b56a3-1c91-4347-ae44-63f05c35e134","Type":"ContainerStarted","Data":"5b7dd7e03c625a8c1bfca8bb788d4e764c64e7659c84950ff30cc14e669adc77"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.665009 4773 generic.go:334] "Generic (PLEG): container finished" podID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" exitCode=143 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.665078 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.681287 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.681270981 podStartE2EDuration="2.681270981s" podCreationTimestamp="2026-01-20 18:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:40.678420573 +0000 UTC m=+1293.600233597" watchObservedRunningTime="2026-01-20 18:51:40.681270981 +0000 UTC m=+1293.603084005" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.586631 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678731 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" exitCode=0 Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678780 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678810 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694"} Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678862 4773 scope.go:117] "RemoveContainer" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.698823 4773 scope.go:117] "RemoveContainer" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.725149 4773 scope.go:117] "RemoveContainer" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.747400 4773 scope.go:117] "RemoveContainer" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749708 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc 
kubenswrapper[4773]: I0120 18:51:41.749801 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749829 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750383 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.761190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts" (OuterVolumeSpecName: "scripts") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.761235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7" (OuterVolumeSpecName: "kube-api-access-7hhn7") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "kube-api-access-7hhn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.807771 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.811010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.822084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854496 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854527 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854536 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854546 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854554 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854563 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.856717 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data" (OuterVolumeSpecName: "config-data") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.926973 4773 scope.go:117] "RemoveContainer" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.927505 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": container with ID starting with 656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e not found: ID does not exist" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.927573 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} err="failed to get container status \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": rpc error: code = NotFound desc = could not find container \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": container with ID starting with 656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.927614 4773 scope.go:117] "RemoveContainer" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc 
kubenswrapper[4773]: E0120 18:51:41.928023 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": container with ID starting with 97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860 not found: ID does not exist" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928074 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} err="failed to get container status \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": rpc error: code = NotFound desc = could not find container \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": container with ID starting with 97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860 not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928129 4773 scope.go:117] "RemoveContainer" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.928442 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": container with ID starting with 774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf not found: ID does not exist" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928481 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} err="failed to get container status 
\"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": rpc error: code = NotFound desc = could not find container \"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": container with ID starting with 774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928503 4773 scope.go:117] "RemoveContainer" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.928844 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": container with ID starting with 9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c not found: ID does not exist" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928867 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} err="failed to get container status \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": rpc error: code = NotFound desc = could not find container \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": container with ID starting with 9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.956309 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.013400 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc 
kubenswrapper[4773]: I0120 18:51:42.025530 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035794 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035814 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035831 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035838 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035850 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035856 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035870 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035875 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036122 4773 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036142 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036150 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036172 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.039481 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044497 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044525 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044767 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.045965 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161486 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161844 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.162168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.263890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264477 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264523 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264634 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.265641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.266017 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: 
\"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.269390 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.269709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271410 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.286448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhs6b\" (UniqueName: 
\"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.359120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.799261 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.365784 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.466276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" path="/var/lib/kubelet/pods/4b106f16-e8b7-4cc5-a5be-fba349150373/volumes" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 
18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.496679 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs" (OuterVolumeSpecName: "logs") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.500832 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x" (OuterVolumeSpecName: "kube-api-access-smh5x") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "kube-api-access-smh5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.524262 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data" (OuterVolumeSpecName: "config-data") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.536162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597080 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597109 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597122 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597131 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.697696 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.697743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"69446a8d0d2a42de6e148590cfbcb0a1f5f08dfbfef8edbc94698b1b5257bf49"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700086 4773 generic.go:334] "Generic (PLEG): container finished" podID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" 
exitCode=0 Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700162 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700168 4773 scope.go:117] "RemoveContainer" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.720749 4773 scope.go:117] "RemoveContainer" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.727351 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.735595 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.751807 4773 scope.go:117] "RemoveContainer" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.752292 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": container with ID starting with f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041 not found: ID does not exist" 
containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.752336 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"} err="failed to get container status \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": rpc error: code = NotFound desc = could not find container \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": container with ID starting with f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041 not found: ID does not exist" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.752363 4773 scope.go:117] "RemoveContainer" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.753120 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": container with ID starting with b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25 not found: ID does not exist" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.753141 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"} err="failed to get container status \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": rpc error: code = NotFound desc = could not find container \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": container with ID starting with b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25 not found: ID does not exist" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760140 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.760628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760650 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.760691 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760700 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760916 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760959 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.761884 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.767879 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.776283 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.776913 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.777161 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903436 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " 
pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.006218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.009643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.009679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.010110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.010397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.026275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.089837 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.152573 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.594269 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.717837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"be5b8861809388d82da03a233f610c1c0601dcd3da676a63a2eea6fbb34fe0ac"} Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.721448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.460492 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" path="/var/lib/kubelet/pods/79f3317e-9e4d-442d-a5b2-d9633262f332/volumes" Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.735073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.735324 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.738666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.763552 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7635358979999998 podStartE2EDuration="2.763535898s" podCreationTimestamp="2026-01-20 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:45.755353683 +0000 UTC m=+1298.677166717" watchObservedRunningTime="2026-01-20 18:51:45.763535898 +0000 UTC m=+1298.685348922" Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.766069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6"} Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.766720 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.793184 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.207076082 podStartE2EDuration="4.793165353s" podCreationTimestamp="2026-01-20 18:51:42 +0000 UTC" firstStartedPulling="2026-01-20 18:51:42.80452896 +0000 UTC m=+1295.726341984" lastFinishedPulling="2026-01-20 18:51:46.390618221 +0000 UTC m=+1299.312431255" observedRunningTime="2026-01-20 18:51:46.786729489 +0000 UTC m=+1299.708542523" watchObservedRunningTime="2026-01-20 18:51:46.793165353 +0000 UTC m=+1299.714978377" Jan 20 18:51:47 crc kubenswrapper[4773]: I0120 18:51:47.638147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:47 crc 
kubenswrapper[4773]: I0120 18:51:47.702403 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"]
Jan 20 18:51:47 crc kubenswrapper[4773]: I0120 18:51:47.702768 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns" containerID="cri-o://19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" gracePeriod=10
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.202785 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") "
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274225 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") "
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") "
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") "
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274458 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") "
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.285724 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz" (OuterVolumeSpecName: "kube-api-access-zmxnz") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "kube-api-access-zmxnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.330768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.331711 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config" (OuterVolumeSpecName: "config") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.333735 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.339271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.376672 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.376893 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377013 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377111 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377182 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788006 4773 generic.go:334] "Generic (PLEG): container finished" podID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" exitCode=0
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788089 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788109 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"}
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788518 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"a26dde9426d7bf6401c218a90f41c5f6ca8484f70a01b6ffde63301f200a5f7e"}
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788542 4773 scope.go:117] "RemoveContainer" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.827921 4773 scope.go:117] "RemoveContainer" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.834464 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"]
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.841918 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"]
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862208 4773 scope.go:117] "RemoveContainer" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"
Jan 20 18:51:48 crc kubenswrapper[4773]: E0120 18:51:48.862622 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": container with ID starting with 19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1 not found: ID does not exist" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862669 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"} err="failed to get container status \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": rpc error: code = NotFound desc = could not find container \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": container with ID starting with 19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1 not found: ID does not exist"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862695 4773 scope.go:117] "RemoveContainer" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"
Jan 20 18:51:48 crc kubenswrapper[4773]: E0120 18:51:48.862895 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": container with ID starting with deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263 not found: ID does not exist" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"
Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862984 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"} err="failed to get container status \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": rpc error: code = NotFound desc = could not find container \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": container with ID starting with deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263 not found: ID does not exist"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.152462 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.169018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.456547 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" path="/var/lib/kubelet/pods/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc/volumes"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.817330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964177 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"]
Jan 20 18:51:49 crc kubenswrapper[4773]: E0120 18:51:49.964666 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964689 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns"
Jan 20 18:51:49 crc kubenswrapper[4773]: E0120 18:51:49.964714 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="init"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964723 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="init"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964960 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.965771 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.971331 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.973224 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.977959 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"]
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.107884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.108470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.108570 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.108633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.217526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.219528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.225683 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.238658 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.295675 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.709409 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"]
Jan 20 18:51:50 crc kubenswrapper[4773]: W0120 18:51:50.711221 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24c1bc90_8fe0_41b4_a7ba_7e15bc787386.slice/crio-9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad WatchSource:0}: Error finding container 9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad: Status 404 returned error can't find the container with id 9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad
Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.806475 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerStarted","Data":"9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad"}
Jan 20 18:51:51 crc kubenswrapper[4773]: I0120 18:51:51.815137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerStarted","Data":"7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0"}
Jan 20 18:51:51 crc kubenswrapper[4773]: I0120 18:51:51.852401 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-b4hvr" podStartSLOduration=2.85238389 podStartE2EDuration="2.85238389s" podCreationTimestamp="2026-01-20 18:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:51.844575374 +0000 UTC m=+1304.766388418" watchObservedRunningTime="2026-01-20 18:51:51.85238389 +0000 UTC m=+1304.774196914"
Jan 20 18:51:54 crc kubenswrapper[4773]: I0120 18:51:54.091075 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 20 18:51:54 crc kubenswrapper[4773]: I0120 18:51:54.091563 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 20 18:51:55 crc kubenswrapper[4773]: I0120 18:51:55.106158 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:51:55 crc kubenswrapper[4773]: I0120 18:51:55.106177 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:51:56 crc kubenswrapper[4773]: I0120 18:51:56.864258 4773 generic.go:334] "Generic (PLEG): container finished" podID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerID="7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0" exitCode=0
Jan 20 18:51:56 crc kubenswrapper[4773]: I0120 18:51:56.864308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerDied","Data":"7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0"}
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.169712 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.170135 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.220547 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") "
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358484 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") "
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") "
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.359411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") "
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.364847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts" (OuterVolumeSpecName: "scripts") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.365159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp" (OuterVolumeSpecName: "kube-api-access-s2krp") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "kube-api-access-s2krp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.389136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.393472 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data" (OuterVolumeSpecName: "config-data") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462089 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462380 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462448 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462501 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") on node \"crc\" DevicePath \"\""
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerDied","Data":"9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad"}
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880304 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad"
Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880581 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr"
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.070564 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.071162 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" containerID="cri-o://5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" gracePeriod=30
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.071247 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" containerID="cri-o://344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" gracePeriod=30
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.083992 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.084261 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" containerID="cri-o://52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" gracePeriod=30
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122044 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122309 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" containerID="cri-o://e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" gracePeriod=30
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122457 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" containerID="cri-o://f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" gracePeriod=30
Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.886754 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.887431 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.887911 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.888033 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler"
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.892087 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" exitCode=143
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.892159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"}
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.895676 4773 generic.go:334] "Generic (PLEG): container finished" podID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" exitCode=143
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.895765 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"}
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.898323 4773 generic.go:334] "Generic (PLEG): container finished" podID="8a5e1be7-c022-49b6-aa10-d23451918579" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" exitCode=0
Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.898373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerDied","Data":"52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b"}
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.091595 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.193835 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") "
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.193917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") "
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.194034 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") "
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.203607 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl" (OuterVolumeSpecName: "kube-api-access-n4rnl") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "kube-api-access-n4rnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.222683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.227153 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data" (OuterVolumeSpecName: "config-data") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296293 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296331 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296341 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerDied","Data":"220ce0272e6c403af39eafd69462c79884cab6d380e200929ec876ec12a03a06"}
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908905 4773 scope.go:117] "RemoveContainer" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908908 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.940278 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.948197 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959473 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:52:00 crc kubenswrapper[4773]: E0120 18:52:00.959863 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959884 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler"
Jan 20 18:52:00 crc kubenswrapper[4773]: E0120 18:52:00.959896 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959903 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960098 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960130 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960732 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.967361 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.977671 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.111915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0"
Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.111997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0"
Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.112059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0"
Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.213752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0"
Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.216701 4773
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.216815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.228086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.229263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.230996 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.283191 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.459454 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" path="/var/lib/kubelet/pods/8a5e1be7-c022-49b6-aa10-d23451918579/volumes" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.731135 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.922500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8413ef33-749f-4413-9965-fd19ad70ebfc","Type":"ContainerStarted","Data":"56d14f0a4aa486f66422e52f4881e665155c3c059013618e9770c8ffd9037759"} Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.922546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8413ef33-749f-4413-9965-fd19ad70ebfc","Type":"ContainerStarted","Data":"ff098fe1dd4f3d2186d0173403cbe18aab23828ba5d4ac096d781bffd65b8b9b"} Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.938008 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.937993439 podStartE2EDuration="1.937993439s" podCreationTimestamp="2026-01-20 18:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:01.936807331 +0000 UTC m=+1314.858620355" watchObservedRunningTime="2026-01-20 18:52:01.937993439 +0000 UTC m=+1314.859806463" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.259017 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:35564->10.217.0.184:8775: read: connection reset by 
peer" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.259061 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:35576->10.217.0.184:8775: read: connection reset by peer" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.640721 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.646779 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: 
\"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747252 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747313 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747375 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc 
kubenswrapper[4773]: I0120 18:52:02.747416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs" (OuterVolumeSpecName: "logs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.748234 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs" (OuterVolumeSpecName: "logs") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.753811 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z" (OuterVolumeSpecName: "kube-api-access-tgw2z") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "kube-api-access-tgw2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.755357 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc" (OuterVolumeSpecName: "kube-api-access-cmtxc") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "kube-api-access-cmtxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.779670 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data" (OuterVolumeSpecName: "config-data") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.784122 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data" (OuterVolumeSpecName: "config-data") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.795149 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.798896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.809875 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.810318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.813021 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850872 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850908 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850923 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850955 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850965 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850973 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850984 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 
18:52:02.850995 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851004 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851012 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851021 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934056 4773 generic.go:334] "Generic (PLEG): container finished" podID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" exitCode=0 Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"be5b8861809388d82da03a233f610c1c0601dcd3da676a63a2eea6fbb34fe0ac"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934176 4773 scope.go:117] "RemoveContainer" 
containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934370 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.939777 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" exitCode=0 Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.940681 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.942218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.942293 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.969466 4773 scope.go:117] "RemoveContainer" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.001452 4773 scope.go:117] "RemoveContainer" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.004181 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.018443 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": container with ID starting with 344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a not found: ID does not exist" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.018516 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} err="failed to get container status \"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": rpc error: code = NotFound desc = could not find container \"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": container with ID starting with 344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.018572 4773 scope.go:117] "RemoveContainer" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.019581 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": container with ID starting with 5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660 not found: ID does not exist" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.019638 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"} err="failed to get container status \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": rpc error: code = NotFound desc = could not find container \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": container with ID 
starting with 5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660 not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.019656 4773 scope.go:117] "RemoveContainer" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.035029 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.052707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.055761 4773 scope.go:117] "RemoveContainer" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.061899 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062356 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062375 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062385 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062391 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062404 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062410 4773 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062437 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062443 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062596 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062609 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062615 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062632 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.063709 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066136 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066151 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.077448 4773 scope.go:117] "RemoveContainer" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.080694 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": container with ID starting with f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14 not found: ID does not exist" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.080748 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"} err="failed to get container status \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": rpc error: code = NotFound desc = could not find container \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": container with ID starting with f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14 not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.080782 4773 scope.go:117] "RemoveContainer" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc 
kubenswrapper[4773]: E0120 18:52:03.081181 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": container with ID starting with e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d not found: ID does not exist" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.081206 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"} err="failed to get container status \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": rpc error: code = NotFound desc = could not find container \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": container with ID starting with e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.082239 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.090658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.098146 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.100055 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.102161 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.102196 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.107637 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155462 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155487 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " 
pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm22d\" (UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155799 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155940 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 
18:52:03.156060 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.156115 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258304 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm22d\" (UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc 
kubenswrapper[4773]: I0120 18:52:03.258432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.259230 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.259412 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262106 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: 
\"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.263126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.264064 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.266125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.277516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm22d\" 
(UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.278465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.384233 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.419643 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.458657 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" path="/var/lib/kubelet/pods/2dac51db-1574-4ccc-bb9a-7c42548d90d3/volumes" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.459569 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" path="/var/lib/kubelet/pods/fb9c6096-2ce8-4b43-a638-50374d21d621/volumes" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.846321 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.943668 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.949252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"bda513a35dc2e02b11f1d8251a21deb3f98a74a2b0e988be3d86043c3f03398e"} 
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.964233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"74c4f6032a4690397bd9e01e0934cf33c43f2651856a5a89dc0042e79dcf3f7e"} Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.965263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"f59b265793b3a2f6a3a4f1b9725ac040e95c74c1b9e0810ecddb76d61e7117d0"} Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.965404 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"2fe6538899a038842344be0b96d2caca61ad6c599b1e50e1b6fce7db44a67138"} Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.966903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"8405b160f421f8567b8f3aded91e1c635e019292c890f51f0d8d7517dfbfa5cb"} Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.966969 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"708c8111c2da7df8820e75d69d2feb4a18de77ed285fba274aa43273801e0fa5"} Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.988400 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.988378042 podStartE2EDuration="1.988378042s" podCreationTimestamp="2026-01-20 18:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:04.982094912 +0000 UTC m=+1317.903907976" 
watchObservedRunningTime="2026-01-20 18:52:04.988378042 +0000 UTC m=+1317.910191066" Jan 20 18:52:05 crc kubenswrapper[4773]: I0120 18:52:05.008311 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.008293729 podStartE2EDuration="3.008293729s" podCreationTimestamp="2026-01-20 18:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:05.003419453 +0000 UTC m=+1317.925232477" watchObservedRunningTime="2026-01-20 18:52:05.008293729 +0000 UTC m=+1317.930106753" Jan 20 18:52:06 crc kubenswrapper[4773]: I0120 18:52:06.284083 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 20 18:52:08 crc kubenswrapper[4773]: I0120 18:52:08.420788 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 20 18:52:08 crc kubenswrapper[4773]: I0120 18:52:08.421865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 20 18:52:11 crc kubenswrapper[4773]: I0120 18:52:11.284659 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 20 18:52:11 crc kubenswrapper[4773]: I0120 18:52:11.309053 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 20 18:52:12 crc kubenswrapper[4773]: I0120 18:52:12.044614 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 20 18:52:12 crc kubenswrapper[4773]: I0120 18:52:12.366582 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.385080 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:52:13 crc 
kubenswrapper[4773]: I0120 18:52:13.385417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.420514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.420576 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.399156 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f890481e-0c9f-4194-8af3-d808bb105995" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.399198 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f890481e-0c9f-4194-8af3-d808bb105995" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.443735 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ceeec9e1-d0f5-497c-b262-2ef81be261ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.443810 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ceeec9e1-d0f5-497c-b262-2ef81be261ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:52:23 crc 
kubenswrapper[4773]: I0120 18:52:23.395530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.397696 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.399839 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.405511 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.432426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.438095 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.441352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.144052 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.150496 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.151305 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:52:28 crc kubenswrapper[4773]: I0120 18:52:28.170179 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:52:28 
crc kubenswrapper[4773]: I0120 18:52:28.170494 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:52:32 crc kubenswrapper[4773]: I0120 18:52:32.485491 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:33 crc kubenswrapper[4773]: I0120 18:52:33.441953 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:36 crc kubenswrapper[4773]: I0120 18:52:36.707206 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" containerID="cri-o://c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" gracePeriod=604796 Jan 20 18:52:37 crc kubenswrapper[4773]: I0120 18:52:37.964861 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" containerID="cri-o://7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" gracePeriod=604796 Jan 20 18:52:42 crc kubenswrapper[4773]: I0120 18:52:42.616561 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 20 18:52:42 crc kubenswrapper[4773]: I0120 18:52:42.627397 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: 
connect: connection refused" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.307416 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344854 4773 generic.go:334] "Generic (PLEG): container finished" podID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" exitCode=0 Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344899 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"} Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"9437201a24daa22de36ef5e4cb32d33d9216523028488aa287392d8e49c9e78c"} Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344969 4773 scope.go:117] "RemoveContainer" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.345109 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.369629 4773 scope.go:117] "RemoveContainer" containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401676 4773 scope.go:117] "RemoveContainer" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401951 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401978 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402080 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402107 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402187 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402252 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402851 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.404376 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": container with ID starting with c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647 not found: ID does not exist" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.404436 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"} err="failed to get container status \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": rpc error: code = NotFound desc = could not find container \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": container with ID starting with c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647 not found: ID does not exist" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.404470 4773 scope.go:117] "RemoveContainer" 
containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.405771 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.407776 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": container with ID starting with 1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f not found: ID does not exist" containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.409873 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} err="failed to get container status \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": rpc error: code = NotFound desc = could not find container \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": container with ID starting with 1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f not found: ID does not exist" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.408538 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.410185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl" (OuterVolumeSpecName: "kube-api-access-ltrtl") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "kube-api-access-ltrtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.411052 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info" (OuterVolumeSpecName: "pod-info") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.417072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.431715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.433239 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.483498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data" (OuterVolumeSpecName: "config-data") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.501445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf" (OuterVolumeSpecName: "server-conf") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503316 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503345 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503356 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503364 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503373 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503381 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503388 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 
18:52:43.503398 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503416 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503424 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.522679 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.571344 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.605011 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.605043 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.683155 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.690335 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.707433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.709245 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.709481 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="setup-container" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709567 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="setup-container" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709885 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" 
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.711194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.717439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720272 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6z6h4" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720284 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720274 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720290 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720564 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720600 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.733834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910863 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910896 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911001 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911022 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911038 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911060 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911117 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012712 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012730 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " 
pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012787 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012856 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012874 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.013863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.014502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.014899 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.015945 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.016641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.016840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.017825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.018281 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.022653 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.025814 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.029611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.063450 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.332772 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.784185 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:45 crc kubenswrapper[4773]: I0120 18:52:45.381453 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"1edf1fc14b34e2adc460b1f25254486fa6076733fa3f6b1dec6d234588f8c56f"} Jan 20 18:52:45 crc kubenswrapper[4773]: I0120 18:52:45.460558 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" path="/var/lib/kubelet/pods/d4dfff97-df7d-498f-9203-9c2cb0d84667/volumes" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.121688 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.134445 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.136947 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.140037 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302461 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302917 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404479 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.405584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.406663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: 
I0120 18:52:47.407214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.407842 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.408397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.415774 4773 generic.go:334] "Generic (PLEG): container finished" podID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerID="7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" exitCode=0 Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.415828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897"} Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.417197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2"} Jan 20 18:52:47 crc 
kubenswrapper[4773]: I0120 18:52:47.430226 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.456645 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.536274 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709523 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709616 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709643 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709795 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709895 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc 
kubenswrapper[4773]: I0120 18:52:47.709924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709975 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.711219 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.714073 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.715283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.716107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.716212 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.730438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.731210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info" (OuterVolumeSpecName: "pod-info") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.734150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw" (OuterVolumeSpecName: "kube-api-access-wj6nw") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "kube-api-access-wj6nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.746585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data" (OuterVolumeSpecName: "config-data") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.788536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf" (OuterVolumeSpecName: "server-conf") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812464 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812501 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812524 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812533 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812541 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812551 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812561 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812570 4773 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812580 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812588 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.834199 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.839011 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.909416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: W0120 18:52:47.911471 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab58995_9c7e_426a_af86_8c1493d3c8d3.slice/crio-0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a WatchSource:0}: Error finding container 0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a: Status 404 returned error can't find the container with id 0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.914682 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.914708 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"81b5a2b92f1105f5c420453ca19111fe1ca35ac9507a3ac978f1c848d16b5b05"} Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427344 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427350 4773 scope.go:117] "RemoveContainer" containerID="7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.430733 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerStarted","Data":"0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a"} Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.468262 4773 scope.go:117] "RemoveContainer" containerID="582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.487942 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.500686 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.510663 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: E0120 18:52:48.511107 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511126 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: E0120 18:52:48.511142 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="setup-container" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511148 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="setup-container" Jan 20 
18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511324 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.512353 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.514752 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515218 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515416 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515571 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pbqbk" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.517247 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.517494 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.573761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624387 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726283 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc 
kubenswrapper[4773]: I0120 18:52:48.726591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727189 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727307 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727431 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.728026 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.728148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.732432 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.734250 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.734692 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.739085 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.745809 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.761582 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.907430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:49 crc kubenswrapper[4773]: I0120 18:52:49.457654 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" path="/var/lib/kubelet/pods/b357137a-6e30-4ed9-a440-c9f3e90f75d8/volumes" Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.233003 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:50 crc kubenswrapper[4773]: W0120 18:52:50.233674 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35926f65_848d_4db5_b50a_deef510ce4be.slice/crio-d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63 WatchSource:0}: Error finding container d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63: Status 404 returned error can't find the container with id d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63 Jan 20 18:52:50 crc 
kubenswrapper[4773]: I0120 18:52:50.471372 4773 generic.go:334] "Generic (PLEG): container finished" podID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb" exitCode=0
Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.473347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"}
Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.476471 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63"}
Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.485860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerStarted","Data":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"}
Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.486313 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw"
Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.504179 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" podStartSLOduration=4.504162007 podStartE2EDuration="4.504162007s" podCreationTimestamp="2026-01-20 18:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:51.502653971 +0000 UTC m=+1364.424466995" watchObservedRunningTime="2026-01-20 18:52:51.504162007 +0000 UTC m=+1364.425975031"
Jan 20 18:52:52 crc kubenswrapper[4773]: I0120 18:52:52.494900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f"}
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.457729 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.519795 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"]
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.520104 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" containerID="cri-o://ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" gracePeriod=10
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.637115 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: connect: connection refused"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.661294 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"]
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.663543 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.672370 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"]
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.697970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698570 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.801803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.823907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.038557 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.170656 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171042 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171093 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171801 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171861 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" gracePeriod=600
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.453894 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.511094 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"]
Jan 20 18:52:58 crc kubenswrapper[4773]: W0120 18:52:58.514753 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba4ab073_f712_41fb_9b44_d83a19b72973.slice/crio-22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5 WatchSource:0}: Error finding container 22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5: Status 404 returned error can't find the container with id 22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523034 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") "
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") "
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") "
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523366 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") "
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") "
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.531090 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl" (OuterVolumeSpecName: "kube-api-access-l6sxl") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "kube-api-access-l6sxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.563349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerStarted","Data":"22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5"}
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.565122 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" exitCode=0
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.565279 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"}
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"d35e4b0ceb4787bad4a95ece01b001352569b9bbc51780aec8a8c24c4fa207e2"}
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566312 4773 scope.go:117] "RemoveContainer" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573508 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" exitCode=0
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"}
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573585 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"}
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.581810 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.590352 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config" (OuterVolumeSpecName: "config") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.590613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.595872 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.623761 4773 scope.go:117] "RemoveContainer" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628381 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628415 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628436 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628450 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628461 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.709554 4773 scope.go:117] "RemoveContainer" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"
Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.710267 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": container with ID starting with ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7 not found: ID does not exist" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.710317 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"} err="failed to get container status \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": rpc error: code = NotFound desc = could not find container \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": container with ID starting with ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7 not found: ID does not exist"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.710345 4773 scope.go:117] "RemoveContainer" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"
Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.711176 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": container with ID starting with 69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd not found: ID does not exist" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.711237 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"} err="failed to get container status \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": rpc error: code = NotFound desc = could not find container \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": container with ID starting with 69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd not found: ID does not exist"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.711269 4773 scope.go:117] "RemoveContainer" containerID="f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.771893 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"]
Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.772293 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="init"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772309 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="init"
Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.772318 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772325 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772482 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.773238 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.777317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.777578 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.779127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.779162 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.786890 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"]
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.837913 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838278 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838410 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.951413 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.951644 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"]
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.959596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.960516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.961413 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"]
Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.967641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.119308 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.458629 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" path="/var/lib/kubelet/pods/cb02b4c0-80ac-4860-8877-f507f8bc2028/volumes"
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.588457 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerID="f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273" exitCode=0
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.588527 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273"}
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.683296 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"]
Jan 20 18:52:59 crc kubenswrapper[4773]: W0120 18:52:59.691216 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02289c77_b6e5_4419_8dc4_597648db0e01.slice/crio-eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec WatchSource:0}: Error finding container eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec: Status 404 returned error can't find the container with id eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec
Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.693942 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.604262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerStarted","Data":"eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec"}
Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.607300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerStarted","Data":"0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480"}
Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.607849 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.630103 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" podStartSLOduration=3.630079628 podStartE2EDuration="3.630079628s" podCreationTimestamp="2026-01-20 18:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:00.627376344 +0000 UTC m=+1373.549189378" watchObservedRunningTime="2026-01-20 18:53:00.630079628 +0000 UTC m=+1373.551892652"
Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.040227 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p"
Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.145016 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"]
Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.145333 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" containerID="cri-o://978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" gracePeriod=10
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.485697 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.679148 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.695974 4773 generic.go:334] "Generic (PLEG): container finished" podID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" exitCode=0
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696038 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"}
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a"}
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696096 4773 scope.go:117] "RemoveContainer" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.743898 4773 scope.go:117] "RemoveContainer" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.764352 4773 scope.go:117] "RemoveContainer" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"
Jan 20 18:53:09 crc kubenswrapper[4773]: E0120 18:53:09.764950 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": container with ID starting with 978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0 not found: ID does not exist" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.764994 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"} err="failed to get container status \"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": rpc error: code = NotFound desc = could not find container \"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": container with ID starting with 978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0 not found: ID does not exist"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.765021 4773 scope.go:117] "RemoveContainer" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"
Jan 20 18:53:09 crc kubenswrapper[4773]: E0120 18:53:09.765343 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": container with ID starting with 07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb not found: ID does not exist" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.765376 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"} err="failed to get container status \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": rpc error: code = NotFound desc = could not find container \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": container with ID starting with 07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb not found: ID does not exist"
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862522 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") "
Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862637 4773
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862783 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.868997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh" (OuterVolumeSpecName: "kube-api-access-tzxdh") pod 
"dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "kube-api-access-tzxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.904259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.906812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.911515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config" (OuterVolumeSpecName: "config") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.920094 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.939322 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.968832 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969083 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969190 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969254 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969335 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969739 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.047361 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.055567 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.707436 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerStarted","Data":"55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384"} Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.722598 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" podStartSLOduration=2.933150977 podStartE2EDuration="12.722580042s" podCreationTimestamp="2026-01-20 18:52:58 +0000 UTC" firstStartedPulling="2026-01-20 18:52:59.693697444 +0000 UTC m=+1372.615510468" lastFinishedPulling="2026-01-20 18:53:09.483126509 +0000 UTC m=+1382.404939533" observedRunningTime="2026-01-20 18:53:10.718790862 +0000 UTC m=+1383.640603906" watchObservedRunningTime="2026-01-20 18:53:10.722580042 +0000 UTC m=+1383.644393066" Jan 20 18:53:11 crc kubenswrapper[4773]: I0120 18:53:11.458456 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" path="/var/lib/kubelet/pods/dab58995-9c7e-426a-af86-8c1493d3c8d3/volumes" Jan 20 18:53:18 crc kubenswrapper[4773]: I0120 18:53:18.778372 4773 generic.go:334] "Generic (PLEG): container finished" podID="375735e1-5d2a-4cc8-892b-4bdcdf9f1e42" containerID="418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2" exitCode=0 Jan 20 18:53:18 crc kubenswrapper[4773]: I0120 
18:53:18.778508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerDied","Data":"418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2"} Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.792437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"e2d8b223c68401e4f892191eb30679316dc5a26d5f2beba30b7c02008ae6195a"} Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.793298 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.817259 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.8172348 podStartE2EDuration="36.8172348s" podCreationTimestamp="2026-01-20 18:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:19.813696326 +0000 UTC m=+1392.735509360" watchObservedRunningTime="2026-01-20 18:53:19.8172348 +0000 UTC m=+1392.739047824" Jan 20 18:53:20 crc kubenswrapper[4773]: I0120 18:53:20.801175 4773 generic.go:334] "Generic (PLEG): container finished" podID="02289c77-b6e5-4419-8dc4-597648db0e01" containerID="55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384" exitCode=0 Jan 20 18:53:20 crc kubenswrapper[4773]: I0120 18:53:20.801715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerDied","Data":"55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384"} Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.196617 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.294731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.294910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.295103 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.295133 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.308328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.308355 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59" (OuterVolumeSpecName: "kube-api-access-pvb59") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "kube-api-access-pvb59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.324134 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory" (OuterVolumeSpecName: "inventory") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.335138 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397492 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397528 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397538 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397547 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerDied","Data":"eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec"} Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821294 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821294 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.888699 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889084 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889101 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889107 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889128 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="init" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889134 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="init" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889282 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889296 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889861 
4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892306 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892472 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.918833 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc 
kubenswrapper[4773]: I0120 18:53:23.010787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112566 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.113284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.116529 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.116918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.118040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.141303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.218400 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.728244 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.833175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerStarted","Data":"b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6"} Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.835412 4773 generic.go:334] "Generic (PLEG): container finished" podID="35926f65-848d-4db5-b50a-deef510ce4be" containerID="87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f" exitCode=0 Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.835494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerDied","Data":"87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.845855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" 
event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerStarted","Data":"ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.848782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"5056057c1e78e0ee19930122cdad3818102dc9ef9fb981a2ffd2a8d11e61ab6e"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.849098 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.868751 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" podStartSLOduration=2.4265908339999998 podStartE2EDuration="2.868738632s" podCreationTimestamp="2026-01-20 18:53:22 +0000 UTC" firstStartedPulling="2026-01-20 18:53:23.738059336 +0000 UTC m=+1396.659872360" lastFinishedPulling="2026-01-20 18:53:24.180207134 +0000 UTC m=+1397.102020158" observedRunningTime="2026-01-20 18:53:24.860741072 +0000 UTC m=+1397.782554096" watchObservedRunningTime="2026-01-20 18:53:24.868738632 +0000 UTC m=+1397.790551646" Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.889615 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.889599809 podStartE2EDuration="36.889599809s" podCreationTimestamp="2026-01-20 18:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:24.88671194 +0000 UTC m=+1397.808524984" watchObservedRunningTime="2026-01-20 18:53:24.889599809 +0000 UTC m=+1397.811412833" Jan 20 18:53:34 crc kubenswrapper[4773]: I0120 18:53:34.336083 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Jan 20 18:53:38 crc kubenswrapper[4773]: I0120 18:53:38.911233 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.762461 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.765449 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.778666 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877382 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc 
kubenswrapper[4773]: I0120 18:53:41.979312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.979381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.979429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.980497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.980718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.999111 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:42 crc kubenswrapper[4773]: I0120 18:53:42.087476 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:42 crc kubenswrapper[4773]: I0120 18:53:42.560625 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053073 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191" exitCode=0 Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191"} Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053351 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"a28dca4f7f445301eea149b36d0aab07434e48fff601378c8c14b9f9e5ae5a54"} Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.070480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0"} Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.952690 4773 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.955264 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.962917 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052494 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.153998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: 
\"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154152 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154799 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.188558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " 
pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.289122 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.806775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.084754 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerStarted","Data":"030c4e3fa1061815a067dd4271885fdcb782aadd09e4a899d330c43028fd9dd4"} Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.087101 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0" exitCode=0 Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.087131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.103867 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" exitCode=0 Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.103948 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.108404 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.157734 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gntwg" podStartSLOduration=2.681690653 podStartE2EDuration="8.157706083s" podCreationTimestamp="2026-01-20 18:53:41 +0000 UTC" firstStartedPulling="2026-01-20 18:53:43.05516823 +0000 UTC m=+1415.976981254" lastFinishedPulling="2026-01-20 18:53:48.53118366 +0000 UTC m=+1421.452996684" observedRunningTime="2026-01-20 18:53:49.147701595 +0000 UTC m=+1422.069514639" watchObservedRunningTime="2026-01-20 18:53:49.157706083 +0000 UTC m=+1422.079519107" Jan 20 18:53:51 crc kubenswrapper[4773]: I0120 18:53:51.129626 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" exitCode=0 Jan 20 18:53:51 crc kubenswrapper[4773]: I0120 18:53:51.129741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1"} Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.088097 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.089428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.159994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" 
event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerStarted","Data":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.193960 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krc65" podStartSLOduration=4.667265675 podStartE2EDuration="7.193927096s" podCreationTimestamp="2026-01-20 18:53:45 +0000 UTC" firstStartedPulling="2026-01-20 18:53:49.106232689 +0000 UTC m=+1422.028045713" lastFinishedPulling="2026-01-20 18:53:51.63289411 +0000 UTC m=+1424.554707134" observedRunningTime="2026-01-20 18:53:52.184417649 +0000 UTC m=+1425.106230673" watchObservedRunningTime="2026-01-20 18:53:52.193927096 +0000 UTC m=+1425.115740120" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.361914 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.364662 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.401857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.480566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.481136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.481339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.582808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583330 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583691 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.584027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.607634 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.703294 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:53 crc kubenswrapper[4773]: I0120 18:53:53.143233 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gntwg" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" probeResult="failure" output=< Jan 20 18:53:53 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 20 18:53:53 crc kubenswrapper[4773]: > Jan 20 18:53:53 crc kubenswrapper[4773]: I0120 18:53:53.238829 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:53 crc kubenswrapper[4773]: W0120 18:53:53.249221 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8824f1f3_1188_4559_b46a_28cbcdb0cf7b.slice/crio-991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725 WatchSource:0}: Error finding container 991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725: Status 404 returned error can't find the container with id 991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725 Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.180882 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" exitCode=0 Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.180986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e"} Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.181603 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" 
event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725"} Jan 20 18:53:55 crc kubenswrapper[4773]: I0120 18:53:55.192988 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.290182 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.290236 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.333906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: E0120 18:53:56.416110 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8824f1f3_1188_4559_b46a_28cbcdb0cf7b.slice/crio-conmon-9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8.scope\": RecentStats: unable to find data in memory cache]" Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.210981 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" exitCode=0 Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.211032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" 
event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.258530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:58 crc kubenswrapper[4773]: I0120 18:53:58.738140 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.230277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.230466 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krc65" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" containerID="cri-o://bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" gracePeriod=2 Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.260409 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hmn8" podStartSLOduration=3.116416474 podStartE2EDuration="7.260384709s" podCreationTimestamp="2026-01-20 18:53:52 +0000 UTC" firstStartedPulling="2026-01-20 18:53:54.18425508 +0000 UTC m=+1427.106068154" lastFinishedPulling="2026-01-20 18:53:58.328223365 +0000 UTC m=+1431.250036389" observedRunningTime="2026-01-20 18:53:59.247769909 +0000 UTC m=+1432.169582953" watchObservedRunningTime="2026-01-20 18:53:59.260384709 +0000 UTC m=+1432.182197753" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.694264 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.861770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862091 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities" (OuterVolumeSpecName: "utilities") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.867589 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf" (OuterVolumeSpecName: "kube-api-access-mhmsf") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "kube-api-access-mhmsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.884757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965409 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965448 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965461 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241309 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" exitCode=0 Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"030c4e3fa1061815a067dd4271885fdcb782aadd09e4a899d330c43028fd9dd4"} Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241412 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241419 4773 scope.go:117] "RemoveContainer" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.269099 4773 scope.go:117] "RemoveContainer" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.297549 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.298098 4773 scope.go:117] "RemoveContainer" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.315492 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.328668 4773 scope.go:117] "RemoveContainer" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 18:54:00.329266 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": container with ID starting with bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2 not found: ID does not exist" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329303 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} err="failed to get container status \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": rpc error: code = NotFound desc = could not find container \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": container with ID starting with bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2 not found: ID does not exist" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329328 4773 scope.go:117] "RemoveContainer" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 18:54:00.329724 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": container with ID starting with 34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1 not found: ID does not exist" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329815 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1"} err="failed to get container status \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": rpc error: code = NotFound desc = could not find container \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": container with ID starting with 34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1 not found: ID does not exist" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329869 4773 scope.go:117] "RemoveContainer" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 
18:54:00.330392 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": container with ID starting with 091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532 not found: ID does not exist" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.330436 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532"} err="failed to get container status \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": rpc error: code = NotFound desc = could not find container \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": container with ID starting with 091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532 not found: ID does not exist" Jan 20 18:54:01 crc kubenswrapper[4773]: I0120 18:54:01.459712 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68194968-898d-49f9-a430-4732bb8122d5" path="/var/lib/kubelet/pods/68194968-898d-49f9-a430-4732bb8122d5/volumes" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.155442 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.207887 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.705115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.705179 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.769202 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:03 crc kubenswrapper[4773]: I0120 18:54:03.328716 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.141332 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.141892 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gntwg" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" containerID="cri-o://f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" gracePeriod=2 Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.287521 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" exitCode=0 Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.287596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40"} Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.560882 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755070 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755320 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.756147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities" (OuterVolumeSpecName: "utilities") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.761863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8" (OuterVolumeSpecName: "kube-api-access-rlrt8") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "kube-api-access-rlrt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.857656 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.857689 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.869087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.959586 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.139465 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.297519 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"a28dca4f7f445301eea149b36d0aab07434e48fff601378c8c14b9f9e5ae5a54"} Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.297602 4773 scope.go:117] "RemoveContainer" containerID="f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" Jan 20 18:54:05 
crc kubenswrapper[4773]: I0120 18:54:05.297709 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hmn8" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" containerID="cri-o://189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" gracePeriod=2 Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.300911 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.318609 4773 scope.go:117] "RemoveContainer" containerID="76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.337023 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.346547 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.355721 4773 scope.go:117] "RemoveContainer" containerID="669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.460920 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" path="/var/lib/kubelet/pods/6af7e52c-fffc-47ba-88de-3340d26e02d5/volumes" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.724192 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873338 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.874170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities" (OuterVolumeSpecName: "utilities") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.874542 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.878401 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc" (OuterVolumeSpecName: "kube-api-access-b5rxc") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "kube-api-access-b5rxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.913325 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.975698 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.975726 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313207 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" exitCode=0 Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313323 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313351 4773 scope.go:117] "RemoveContainer" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725"} Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.345562 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.346567 4773 scope.go:117] "RemoveContainer" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.353484 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.395806 4773 scope.go:117] "RemoveContainer" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.430563 4773 scope.go:117] "RemoveContainer" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 18:54:06.431096 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": container with ID starting with 189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121 not found: ID does not exist" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431137 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} err="failed to get container status \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": rpc error: code = NotFound desc = could not find container \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": container with ID starting with 189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121 not found: ID does not exist" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431181 4773 scope.go:117] "RemoveContainer" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 18:54:06.431595 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": container with ID starting with 9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8 not found: ID does not exist" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431641 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} err="failed to get container status \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": rpc error: code = NotFound desc = could not find container \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": container with ID starting with 9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8 not found: ID does not exist" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431658 4773 scope.go:117] "RemoveContainer" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 
18:54:06.431945 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": container with ID starting with e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e not found: ID does not exist" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431970 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e"} err="failed to get container status \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": rpc error: code = NotFound desc = could not find container \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": container with ID starting with e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e not found: ID does not exist" Jan 20 18:54:07 crc kubenswrapper[4773]: I0120 18:54:07.462613 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" path="/var/lib/kubelet/pods/8824f1f3-1188-4559-b46a-28cbcdb0cf7b/volumes" Jan 20 18:54:17 crc kubenswrapper[4773]: I0120 18:54:17.752555 4773 scope.go:117] "RemoveContainer" containerID="fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd" Jan 20 18:54:58 crc kubenswrapper[4773]: I0120 18:54:58.170137 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:54:58 crc kubenswrapper[4773]: I0120 18:54:58.170980 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.859826 4773 scope.go:117] "RemoveContainer" containerID="5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.886195 4773 scope.go:117] "RemoveContainer" containerID="cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.914471 4773 scope.go:117] "RemoveContainer" containerID="ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.962303 4773 scope.go:117] "RemoveContainer" containerID="fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.991269 4773 scope.go:117] "RemoveContainer" containerID="a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9" Jan 20 18:55:28 crc kubenswrapper[4773]: I0120 18:55:28.172107 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:55:28 crc kubenswrapper[4773]: I0120 18:55:28.172913 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.169919 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.170562 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.170610 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.171432 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.171493 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" gracePeriod=600 Jan 20 18:55:58 crc kubenswrapper[4773]: E0120 18:55:58.293592 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256056 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" exitCode=0 Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256177 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"} Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256497 4773 scope.go:117] "RemoveContainer" containerID="14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.257058 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:55:59 crc kubenswrapper[4773]: E0120 18:55:59.257407 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:12 crc kubenswrapper[4773]: I0120 18:56:12.447707 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:12 crc kubenswrapper[4773]: E0120 18:56:12.448481 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:24 crc kubenswrapper[4773]: I0120 18:56:24.447222 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:24 crc kubenswrapper[4773]: E0120 18:56:24.449243 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.625599 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.627723 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628153 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628239 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628296 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc 
kubenswrapper[4773]: E0120 18:56:32.628373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628432 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628493 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628549 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628614 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628666 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628726 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628858 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628920 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629006 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc 
kubenswrapper[4773]: E0120 18:56:32.629070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629141 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.629204 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629265 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629523 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629589 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629647 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.631273 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.655478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bph6d"]
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.773842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.774087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.774202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.876010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.876177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.895743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.955952 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:33 crc kubenswrapper[4773]: I0120 18:56:33.444922 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bph6d"]
Jan 20 18:56:33 crc kubenswrapper[4773]: I0120 18:56:33.535704 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerStarted","Data":"c54f0307e49dba271d8b05075aa14d69a2b8b883abaff80426c0b2b66837b5eb"}
Jan 20 18:56:34 crc kubenswrapper[4773]: I0120 18:56:34.560978 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f" exitCode=0
Jan 20 18:56:34 crc kubenswrapper[4773]: I0120 18:56:34.561323 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"}
Jan 20 18:56:35 crc kubenswrapper[4773]: I0120 18:56:35.446944 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:56:35 crc kubenswrapper[4773]: E0120 18:56:35.447452 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:56:38 crc kubenswrapper[4773]: I0120 18:56:38.597659 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a" exitCode=0
Jan 20 18:56:38 crc kubenswrapper[4773]: I0120 18:56:38.597732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"}
Jan 20 18:56:39 crc kubenswrapper[4773]: I0120 18:56:39.607270 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerStarted","Data":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"}
Jan 20 18:56:39 crc kubenswrapper[4773]: I0120 18:56:39.629701 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bph6d" podStartSLOduration=3.08771012 podStartE2EDuration="7.629680452s" podCreationTimestamp="2026-01-20 18:56:32 +0000 UTC" firstStartedPulling="2026-01-20 18:56:34.564289489 +0000 UTC m=+1587.486102513" lastFinishedPulling="2026-01-20 18:56:39.106259821 +0000 UTC m=+1592.028072845" observedRunningTime="2026-01-20 18:56:39.625051792 +0000 UTC m=+1592.546864816" watchObservedRunningTime="2026-01-20 18:56:39.629680452 +0000 UTC m=+1592.551493466"
Jan 20 18:56:42 crc kubenswrapper[4773]: I0120 18:56:42.956112 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:42 crc kubenswrapper[4773]: I0120 18:56:42.956769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:43 crc kubenswrapper[4773]: I0120 18:56:43.017874 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:46 crc kubenswrapper[4773]: I0120 18:56:46.447137 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:56:46 crc kubenswrapper[4773]: E0120 18:56:46.447816 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.003285 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.046918 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bph6d"]
Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750312 4773 generic.go:334] "Generic (PLEG): container finished" podID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerID="ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1" exitCode=0
Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bph6d" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server" containerID="cri-o://36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" gracePeriod=2
Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerDied","Data":"ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1"}
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.216159 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.411672 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") "
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.412017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") "
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.412218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") "
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.413568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities" (OuterVolumeSpecName: "utilities") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.419160 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9" (OuterVolumeSpecName: "kube-api-access-t8pd9") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "kube-api-access-t8pd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.461812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515526 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515560 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515576 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760149 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" exitCode=0
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760207 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bph6d"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"}
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760663 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"c54f0307e49dba271d8b05075aa14d69a2b8b883abaff80426c0b2b66837b5eb"}
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760692 4773 scope.go:117] "RemoveContainer" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.784437 4773 scope.go:117] "RemoveContainer" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.807555 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bph6d"]
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.818716 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bph6d"]
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.823671 4773 scope.go:117] "RemoveContainer" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.864989 4773 scope.go:117] "RemoveContainer" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"
Jan 20 18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.865476 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": container with ID starting with 36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812 not found: ID does not exist" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865521 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"} err="failed to get container status \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": rpc error: code = NotFound desc = could not find container \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": container with ID starting with 36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812 not found: ID does not exist"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865549 4773 scope.go:117] "RemoveContainer" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"
Jan 20 18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.865852 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": container with ID starting with 08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a not found: ID does not exist" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865879 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"} err="failed to get container status \"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": rpc error: code = NotFound desc = could not find container \"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": container with ID starting with 08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a not found: ID does not exist"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865899 4773 scope.go:117] "RemoveContainer" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"
Jan 20 18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.866141 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": container with ID starting with 37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f not found: ID does not exist" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"
Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.866166 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"} err="failed to get container status \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": rpc error: code = NotFound desc = could not find container \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": container with ID starting with 37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f not found: ID does not exist"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.135829 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") "
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330318 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") "
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330394 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") "
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330436 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") "
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.335799 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.336129 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x" (OuterVolumeSpecName: "kube-api-access-nrx7x") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "kube-api-access-nrx7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.358693 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.360674 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory" (OuterVolumeSpecName: "inventory") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432691 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432725 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432734 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432745 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.456287 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" path="/var/lib/kubelet/pods/c7936be6-fe56-42d1-a86d-c2d3dd3718df/volumes"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772205 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerDied","Data":"b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6"}
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772341 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.895520 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"]
Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896225 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-content"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896244 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-content"
Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896265 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-utilities"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896273 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-utilities"
Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896290 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896300 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896344 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896353 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896747 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896788 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.897759 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.903954 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"]
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904123 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904339 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904578 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904786 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.947157 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.947271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.947515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049122 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.053134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.053888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.065520 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.251564 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.769156 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"]
Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.781299 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerStarted","Data":"1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef"}
Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.453466 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:56:57 crc kubenswrapper[4773]: E0120 18:56:57.453737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.790867 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerStarted","Data":"a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d"}
Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.812978 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" podStartSLOduration=2.049306407 podStartE2EDuration="2.812961272s" podCreationTimestamp="2026-01-20 18:56:55 +0000 UTC" firstStartedPulling="2026-01-20 18:56:56.77030715 +0000 UTC m=+1609.692120164" lastFinishedPulling="2026-01-20 18:56:57.533962005 +0000 UTC m=+1610.455775029" observedRunningTime="2026-01-20 18:56:57.808700591 +0000 UTC m=+1610.730513625" watchObservedRunningTime="2026-01-20 18:56:57.812961272 +0000 UTC m=+1610.734774296"
Jan 20 18:57:10 crc kubenswrapper[4773]: I0120 18:57:10.446484 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:57:10 crc kubenswrapper[4773]: E0120 18:57:10.447194 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:57:24 crc kubenswrapper[4773]: I0120 18:57:24.447234 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:57:24 crc kubenswrapper[4773]: E0120 18:57:24.448046 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:57:35 crc kubenswrapper[4773]: I0120 18:57:35.448307 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:57:35 crc kubenswrapper[4773]: E0120 18:57:35.449077 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:57:49 crc kubenswrapper[4773]: I0120 18:57:49.448634 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:57:49 crc kubenswrapper[4773]: E0120 18:57:49.449840 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:58:02 crc kubenswrapper[4773]: I0120 18:58:02.446866 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:58:02 crc kubenswrapper[4773]: E0120 18:58:02.447714 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:58:08 crc kubenswrapper[4773]: I0120 18:58:08.036280 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xjqwr"]
Jan 20 18:58:08 crc kubenswrapper[4773]: I0120 18:58:08.045602 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-xjqwr"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.046898 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.059335 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8pd22"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.077606 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.087366 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8bf57"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.094406 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8bf57"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.101204 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8pd22"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.108539 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.115583 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.122768 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.129613 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"]
Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.466298 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" path="/var/lib/kubelet/pods/0c64cf4d-562e-4a78-a22b-d682436d5db3/volumes"
Jan 20
18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.467806 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" path="/var/lib/kubelet/pods/2ce8f955-26cb-4860-afc1-effceac1d7a4/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.469853 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" path="/var/lib/kubelet/pods/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.471464 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" path="/var/lib/kubelet/pods/484e46fc-ebda-496a-9884-295fcd065e9b/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.474556 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" path="/var/lib/kubelet/pods/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.475770 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" path="/var/lib/kubelet/pods/c6e707f5-41a8-43c6-976a-7a9645c0b0ca/volumes" Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.126906 4773 generic.go:334] "Generic (PLEG): container finished" podID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerID="a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d" exitCode=0 Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.126986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerDied","Data":"a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d"} Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.447051 4773 scope.go:117] "RemoveContainer" 
containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:15 crc kubenswrapper[4773]: E0120 18:58:15.452752 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.562742 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.658984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.659065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.659163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.668186 4773 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45" (OuterVolumeSpecName: "kube-api-access-gxp45") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "kube-api-access-gxp45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.688312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory" (OuterVolumeSpecName: "inventory") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.688705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760306 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760344 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760353 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.165782 4773 scope.go:117] "RemoveContainer" containerID="2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.196332 4773 scope.go:117] "RemoveContainer" containerID="c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.242571 4773 scope.go:117] "RemoveContainer" containerID="bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.261635 4773 scope.go:117] "RemoveContainer" containerID="e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.262992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerDied","Data":"1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef"} Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 
18:58:18.263020 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.263041 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.286379 4773 scope.go:117] "RemoveContainer" containerID="7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.331639 4773 scope.go:117] "RemoveContainer" containerID="100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.643392 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:18 crc kubenswrapper[4773]: E0120 18:58:18.643965 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.643978 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.644189 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.644761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.651425 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.692794 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.692838 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.693086 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.693388 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.795984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.796309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: 
I0120 18:58:18.796440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.897365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.898194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.898322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.902300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.902773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.915518 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.013443 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.514795 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.527217 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.282252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerStarted","Data":"214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b"} Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.282592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerStarted","Data":"876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31"} Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.300625 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" podStartSLOduration=1.791173396 podStartE2EDuration="2.3006069s" podCreationTimestamp="2026-01-20 18:58:18 +0000 UTC" firstStartedPulling="2026-01-20 18:58:19.527026757 +0000 UTC m=+1692.448839781" lastFinishedPulling="2026-01-20 18:58:20.036460261 +0000 UTC m=+1692.958273285" observedRunningTime="2026-01-20 18:58:20.299636647 +0000 UTC m=+1693.221449681" watchObservedRunningTime="2026-01-20 18:58:20.3006069 +0000 UTC m=+1693.222419924" Jan 20 18:58:23 crc kubenswrapper[4773]: I0120 18:58:23.037438 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:58:23 crc 
kubenswrapper[4773]: I0120 18:58:23.044772 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:58:23 crc kubenswrapper[4773]: I0120 18:58:23.470273 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" path="/var/lib/kubelet/pods/7f5455e9-7072-4154-b881-75a1da2c0466/volumes" Jan 20 18:58:25 crc kubenswrapper[4773]: I0120 18:58:25.323296 4773 generic.go:334] "Generic (PLEG): container finished" podID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerID="214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b" exitCode=0 Jan 20 18:58:25 crc kubenswrapper[4773]: I0120 18:58:25.323391 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerDied","Data":"214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b"} Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.724315 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.764114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp" (OuterVolumeSpecName: "kube-api-access-wdntp") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "kube-api-access-wdntp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.783758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.785558 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory" (OuterVolumeSpecName: "inventory") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860425 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860455 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860465 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" 
event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerDied","Data":"876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31"} Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340367 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340196 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.432488 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:27 crc kubenswrapper[4773]: E0120 18:58:27.432894 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.432916 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.433169 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.433837 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440462 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.465626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572599 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.674894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.675355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.675731 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.681534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.684970 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.690777 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.752977 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:28 crc kubenswrapper[4773]: I0120 18:58:28.237030 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:28 crc kubenswrapper[4773]: I0120 18:58:28.348574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerStarted","Data":"1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6"} Jan 20 18:58:29 crc kubenswrapper[4773]: I0120 18:58:29.357276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerStarted","Data":"275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d"} Jan 20 18:58:29 crc kubenswrapper[4773]: I0120 18:58:29.382889 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" podStartSLOduration=1.825537009 podStartE2EDuration="2.382863766s" podCreationTimestamp="2026-01-20 18:58:27 +0000 UTC" firstStartedPulling="2026-01-20 18:58:28.247673418 +0000 UTC m=+1701.169486442" lastFinishedPulling="2026-01-20 18:58:28.805000175 +0000 UTC m=+1701.726813199" observedRunningTime="2026-01-20 18:58:29.374179118 +0000 UTC m=+1702.295992152" watchObservedRunningTime="2026-01-20 18:58:29.382863766 +0000 UTC m=+1702.304676790" Jan 20 18:58:30 crc kubenswrapper[4773]: I0120 18:58:30.447664 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:30 crc kubenswrapper[4773]: E0120 18:58:30.448372 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.029248 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.036356 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.458460 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" path="/var/lib/kubelet/pods/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.034397 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.043318 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fldlp"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.050405 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.071137 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.079083 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.088707 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.095541 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fldlp"] 
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.102730 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jqhz4"]
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.110166 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"]
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.117731 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"]
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.124676 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"]
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.132008 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"]
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.458398 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" path="/var/lib/kubelet/pods/181581ac-d6d3-4700-bfb7-7179a262a27c/volumes"
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.459288 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" path="/var/lib/kubelet/pods/b2544d2a-4467-4356-9aee-21a75f6efedc/volumes"
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.459888 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" path="/var/lib/kubelet/pods/b313ef44-3ec0-4e2e-bc88-0187cce26783/volumes"
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.460431 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" path="/var/lib/kubelet/pods/b742ea09-e1ce-4311-a9bf-7736d3ab235c/volumes"
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.461383 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" path="/var/lib/kubelet/pods/be215ecb-8014-4db1-8eac-59f0d3dee870/volumes"
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.461853 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" path="/var/lib/kubelet/pods/d813dade-efd1-404d-ae3f-ecea71ffb5ee/volumes"
Jan 20 18:58:42 crc kubenswrapper[4773]: I0120 18:58:42.446863 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:58:42 crc kubenswrapper[4773]: E0120 18:58:42.447188 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:58:46 crc kubenswrapper[4773]: I0120 18:58:46.028570 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kmlg7"]
Jan 20 18:58:46 crc kubenswrapper[4773]: I0120 18:58:46.036578 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kmlg7"]
Jan 20 18:58:47 crc kubenswrapper[4773]: I0120 18:58:47.460707 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" path="/var/lib/kubelet/pods/49d41a48-da79-4b93-bf84-ab8b94fed1c1/volumes"
Jan 20 18:58:53 crc kubenswrapper[4773]: I0120 18:58:53.446695 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:58:53 crc kubenswrapper[4773]: E0120 18:58:53.447595 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:59:05 crc kubenswrapper[4773]: I0120 18:59:05.643890 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerID="275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d" exitCode=0
Jan 20 18:59:05 crc kubenswrapper[4773]: I0120 18:59:05.644017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerDied","Data":"275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d"}
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.028354 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176091 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") "
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") "
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") "
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.182235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft" (OuterVolumeSpecName: "kube-api-access-rp2ft") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "kube-api-access-rp2ft". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.202197 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory" (OuterVolumeSpecName: "inventory") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.202364 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278332 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278376 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278395 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.453349 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:59:07 crc kubenswrapper[4773]: E0120 18:59:07.453590 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.659884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerDied","Data":"1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6"}
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.660249 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.660029 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.746891 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"]
Jan 20 18:59:07 crc kubenswrapper[4773]: E0120 18:59:07.747295 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.747313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.747475 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.748084 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.749957 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750105 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750342 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750342 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.758225 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"]
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.790654 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.790748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.790863 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.896887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.901635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.911045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.063785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.565474 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"]
Jan 20 18:59:08 crc kubenswrapper[4773]: W0120 18:59:08.571243 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4be003e8_2c0f_45c8_944d_b126c8cbd1b0.slice/crio-d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e WatchSource:0}: Error finding container d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e: Status 404 returned error can't find the container with id d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e
Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.668147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerStarted","Data":"d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e"}
Jan 20 18:59:09 crc kubenswrapper[4773]: I0120 18:59:09.676567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerStarted","Data":"d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb"}
Jan 20 18:59:09 crc kubenswrapper[4773]: I0120 18:59:09.695073 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" podStartSLOduration=2.13248792 podStartE2EDuration="2.695053573s" podCreationTimestamp="2026-01-20 18:59:07 +0000 UTC" firstStartedPulling="2026-01-20 18:59:08.5746347 +0000 UTC m=+1741.496447764" lastFinishedPulling="2026-01-20 18:59:09.137200373 +0000 UTC m=+1742.059013417" observedRunningTime="2026-01-20 18:59:09.687944473 +0000 UTC m=+1742.609757517" watchObservedRunningTime="2026-01-20 18:59:09.695053573 +0000 UTC m=+1742.616866597"
Jan 20 18:59:13 crc kubenswrapper[4773]: E0120 18:59:13.220048 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]"
Jan 20 18:59:13 crc kubenswrapper[4773]: I0120 18:59:13.730525 4773 generic.go:334] "Generic (PLEG): container finished" podID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerID="d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb" exitCode=0
Jan 20 18:59:13 crc kubenswrapper[4773]: I0120 18:59:13.730612 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerDied","Data":"d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb"}
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.173922 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") "
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328465 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") "
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328565 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") "
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.333823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48" (OuterVolumeSpecName: "kube-api-access-4tj48") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "kube-api-access-4tj48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.353844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.355366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory" (OuterVolumeSpecName: "inventory") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430810 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430853 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430862 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") on node \"crc\" DevicePath \"\""
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768319 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerDied","Data":"d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e"}
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768366 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e"
Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768460 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.111846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"]
Jan 20 18:59:16 crc kubenswrapper[4773]: E0120 18:59:16.112629 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.112654 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.112855 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.113621 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.116841 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.117030 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.119892 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.121572 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.125857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"]
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.142983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.143298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.143389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.251661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.251674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.264982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.433503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"
Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.907325 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"]
Jan 20 18:59:17 crc kubenswrapper[4773]: I0120 18:59:17.790226 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerStarted","Data":"c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6"}
Jan 20 18:59:17 crc kubenswrapper[4773]: I0120 18:59:17.790602 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerStarted","Data":"0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da"}
Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.436245 4773 scope.go:117] "RemoveContainer" containerID="c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d"
Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.449109 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"
Jan 20 18:59:18 crc kubenswrapper[4773]: E0120 18:59:18.449400 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.458981 4773 scope.go:117] "RemoveContainer" containerID="80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616"
Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.531827 4773 scope.go:117] "RemoveContainer" containerID="7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.551049 4773 scope.go:117] "RemoveContainer" containerID="17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.566958 4773 scope.go:117] "RemoveContainer" containerID="19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.624340 4773 scope.go:117] "RemoveContainer" containerID="a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.689539 4773 scope.go:117] "RemoveContainer" containerID="59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.705832 4773 scope.go:117] "RemoveContainer" containerID="3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.747026 4773 scope.go:117] "RemoveContainer" containerID="a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84" Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.032811 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" podStartSLOduration=5.392668197 podStartE2EDuration="6.032793948s" podCreationTimestamp="2026-01-20 18:59:15 +0000 UTC" firstStartedPulling="2026-01-20 18:59:16.919513665 +0000 UTC m=+1749.841326689" lastFinishedPulling="2026-01-20 18:59:17.559639416 +0000 UTC m=+1750.481452440" observedRunningTime="2026-01-20 18:59:17.809574912 +0000 UTC m=+1750.731387956" watchObservedRunningTime="2026-01-20 18:59:21.032793948 +0000 UTC m=+1753.954606972" Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.039910 4773 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.046557 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.456972 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" path="/var/lib/kubelet/pods/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a/volumes" Jan 20 18:59:23 crc kubenswrapper[4773]: E0120 18:59:23.418974 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.028333 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.038733 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.048601 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.060851 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.460557 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" path="/var/lib/kubelet/pods/0158a06a-bb30-4d75-904f-90a4c6307fd6/volumes" Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.461775 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" path="/var/lib/kubelet/pods/17ca6753-a956-4078-8927-2f2a6c41cb80/volumes" Jan 20 18:59:30 crc kubenswrapper[4773]: I0120 
18:59:30.447243 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:30 crc kubenswrapper[4773]: E0120 18:59:30.448676 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:33 crc kubenswrapper[4773]: E0120 18:59:33.655786 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:38 crc kubenswrapper[4773]: I0120 18:59:38.035680 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:59:38 crc kubenswrapper[4773]: I0120 18:59:38.047698 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.031211 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.040364 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.459149 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" path="/var/lib/kubelet/pods/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b/volumes" Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.460271 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d9eee838-721f-48cc-a5aa-37644a62d846" path="/var/lib/kubelet/pods/d9eee838-721f-48cc-a5aa-37644a62d846/volumes" Jan 20 18:59:41 crc kubenswrapper[4773]: I0120 18:59:41.447290 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:41 crc kubenswrapper[4773]: E0120 18:59:41.447819 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:43 crc kubenswrapper[4773]: E0120 18:59:43.877691 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:54 crc kubenswrapper[4773]: E0120 18:59:54.075622 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:54 crc kubenswrapper[4773]: I0120 18:59:54.447589 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:54 crc kubenswrapper[4773]: E0120 18:59:54.448204 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.148668 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.150246 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.152899 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.152914 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.165206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.282921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.283052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: 
\"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.283155 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.385841 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.393454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.402549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.521809 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.953628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.162476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerStarted","Data":"c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486"} Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.162786 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerStarted","Data":"f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321"} Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.180820 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" podStartSLOduration=1.180800462 podStartE2EDuration="1.180800462s" podCreationTimestamp="2026-01-20 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:00:01.176920958 +0000 UTC m=+1794.098733982" watchObservedRunningTime="2026-01-20 19:00:01.180800462 +0000 UTC m=+1794.102613506" Jan 20 19:00:02 crc kubenswrapper[4773]: I0120 19:00:02.184739 4773 generic.go:334] "Generic (PLEG): container finished" podID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerID="c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486" exitCode=0 Jan 20 19:00:02 crc kubenswrapper[4773]: I0120 19:00:02.184806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerDied","Data":"c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486"} Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.496478 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647378 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647689 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647721 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.648097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.648433 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.653455 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.653511 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2" (OuterVolumeSpecName: "kube-api-access-btzj2") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "kube-api-access-btzj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.749881 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.749951 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerDied","Data":"f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321"} Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207502 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321" Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207511 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:04 crc kubenswrapper[4773]: E0120 19:00:04.310771 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48dbb315_9da1_4b84_9a8e_86448b7ce2bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 19:00:05 crc kubenswrapper[4773]: I0120 19:00:05.447231 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:05 crc kubenswrapper[4773]: E0120 19:00:05.447765 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:06 crc kubenswrapper[4773]: I0120 19:00:06.229528 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerID="c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6" exitCode=0 Jan 20 19:00:06 crc kubenswrapper[4773]: I0120 19:00:06.229609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerDied","Data":"c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6"} Jan 20 19:00:07 crc kubenswrapper[4773]: E0120 19:00:07.435908 4773 info.go:109] Failed to get network devices: open 
/sys/class/net/0b6c753b91a3baa/address: no such file or directory Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.602683 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722198 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722278 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.728709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx" (OuterVolumeSpecName: "kube-api-access-zhtkx") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "kube-api-access-zhtkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.746104 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory" (OuterVolumeSpecName: "inventory") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.748229 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824499 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824536 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824549 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250335 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" 
event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerDied","Data":"0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da"} Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250379 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250416 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332288 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:08 crc kubenswrapper[4773]: E0120 19:00:08.332730 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332747 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: E0120 19:00:08.332772 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332780 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332987 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.333008 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.333566 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339223 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339427 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.347676 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.363690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.432838 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.433151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.433810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.535635 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.535990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.536102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.540526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.541633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.553141 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.682350 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:09 crc kubenswrapper[4773]: I0120 19:00:09.224954 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:09 crc kubenswrapper[4773]: I0120 19:00:09.260584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerStarted","Data":"4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18"} Jan 20 19:00:10 crc kubenswrapper[4773]: I0120 19:00:10.270426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerStarted","Data":"93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308"} Jan 20 19:00:10 crc kubenswrapper[4773]: I0120 19:00:10.301918 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" podStartSLOduration=1.7794861050000002 podStartE2EDuration="2.301891022s" podCreationTimestamp="2026-01-20 19:00:08 +0000 UTC" firstStartedPulling="2026-01-20 19:00:09.23138112 +0000 UTC m=+1802.153194144" lastFinishedPulling="2026-01-20 19:00:09.753786037 +0000 UTC m=+1802.675599061" observedRunningTime="2026-01-20 19:00:10.28686696 +0000 UTC m=+1803.208680014" watchObservedRunningTime="2026-01-20 19:00:10.301891022 +0000 UTC m=+1803.223704076" Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.040879 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.049283 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.317006 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerID="93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308" exitCode=0 Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.317060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerDied","Data":"93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308"} Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.029842 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.039779 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.454456 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:17 crc kubenswrapper[4773]: E0120 19:00:17.455010 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.480392 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" path="/var/lib/kubelet/pods/833eac91-4269-4e1e-9923-8dd8ed2276dc/volumes" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.480967 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" path="/var/lib/kubelet/pods/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7/volumes" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 
19:00:17.701326 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802434 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.817812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl" (OuterVolumeSpecName: "kube-api-access-jvbpl") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "kube-api-access-jvbpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.826709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.834684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904879 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904928 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904956 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.034427 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 
19:00:18.040411 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.047707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.055282 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.063258 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.069510 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.076705 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.084007 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerDied","Data":"4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18"} Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334414 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334471 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409184 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:18 crc kubenswrapper[4773]: E0120 19:00:18.409664 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409685 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409901 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.410617 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414075 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414323 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414828 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.415024 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.430302 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516832 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618883 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.623091 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: 
\"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.623260 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.638560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.730020 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.922317 4773 scope.go:117] "RemoveContainer" containerID="dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.968350 4773 scope.go:117] "RemoveContainer" containerID="3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.997364 4773 scope.go:117] "RemoveContainer" containerID="33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.015125 4773 scope.go:117] "RemoveContainer" containerID="bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.049561 4773 scope.go:117] "RemoveContainer" containerID="1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.082114 4773 scope.go:117] "RemoveContainer" containerID="7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.097623 4773 scope.go:117] "RemoveContainer" containerID="afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.235478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.341602 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerStarted","Data":"20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847"} Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.457107 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" path="/var/lib/kubelet/pods/2bd3a449-dc14-46ca-8e19-64d0a282483e/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.458032 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" path="/var/lib/kubelet/pods/47dcb7c9-ffa7-46bc-b695-02aea6e679a1/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.458586 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7455911e-a1ad-442b-97b9-362496066bbf" path="/var/lib/kubelet/pods/7455911e-a1ad-442b-97b9-362496066bbf/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.459126 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" path="/var/lib/kubelet/pods/f4f47b18-303f-415d-8bf8-c1f7a075b747/volumes" Jan 20 19:00:20 crc kubenswrapper[4773]: I0120 19:00:20.365651 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerStarted","Data":"8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261"} Jan 20 19:00:20 crc kubenswrapper[4773]: I0120 19:00:20.391144 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" podStartSLOduration=1.90963208 podStartE2EDuration="2.391126191s" podCreationTimestamp="2026-01-20 19:00:18 +0000 UTC" firstStartedPulling="2026-01-20 19:00:19.242421047 +0000 UTC m=+1812.164234071" lastFinishedPulling="2026-01-20 19:00:19.723915158 +0000 UTC m=+1812.645728182" observedRunningTime="2026-01-20 19:00:20.38568994 +0000 UTC m=+1813.307502974" watchObservedRunningTime="2026-01-20 19:00:20.391126191 +0000 UTC m=+1813.312939215" Jan 20 19:00:27 crc kubenswrapper[4773]: I0120 19:00:27.429311 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerID="8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261" exitCode=0 Jan 20 19:00:27 crc kubenswrapper[4773]: I0120 19:00:27.429382 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerDied","Data":"8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261"} Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.799687 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899344 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899408 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.905211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q" 
(OuterVolumeSpecName: "kube-api-access-8lx7q") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "kube-api-access-8lx7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.923518 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.932070 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory" (OuterVolumeSpecName: "inventory") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001571 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001613 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001630 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.451678 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.463106 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerDied","Data":"20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847"} Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.463158 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.542864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:29 crc kubenswrapper[4773]: E0120 19:00:29.543438 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" 
containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.543468 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.543688 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.544590 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551319 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551732 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551973 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.554405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620380 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: 
\"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722384 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722978 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.727655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.730490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.754819 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.868571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:30 crc kubenswrapper[4773]: I0120 19:00:30.402763 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:30 crc kubenswrapper[4773]: W0120 19:00:30.408172 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10bff0cd_1771_46dc_87e8_a7ce91f520c8.slice/crio-8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23 WatchSource:0}: Error finding container 8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23: Status 404 returned error can't find the container with id 8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23 Jan 20 19:00:30 crc kubenswrapper[4773]: I0120 19:00:30.459789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerStarted","Data":"8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23"} Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.448080 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:31 crc kubenswrapper[4773]: E0120 19:00:31.448588 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.468892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerStarted","Data":"737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df"} Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.506446 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" podStartSLOduration=2.031381184 podStartE2EDuration="2.506420961s" podCreationTimestamp="2026-01-20 19:00:29 +0000 UTC" firstStartedPulling="2026-01-20 19:00:30.412630498 +0000 UTC m=+1823.334443522" lastFinishedPulling="2026-01-20 19:00:30.887670275 +0000 UTC m=+1823.809483299" observedRunningTime="2026-01-20 19:00:31.487686189 +0000 UTC m=+1824.409499253" watchObservedRunningTime="2026-01-20 19:00:31.506420961 +0000 UTC m=+1824.428234015" Jan 20 19:00:41 crc kubenswrapper[4773]: I0120 19:00:41.551419 4773 generic.go:334] "Generic (PLEG): container finished" podID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerID="737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df" exitCode=0 Jan 20 19:00:41 crc kubenswrapper[4773]: I0120 19:00:41.551510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerDied","Data":"737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df"} Jan 20 19:00:42 crc kubenswrapper[4773]: I0120 19:00:42.997324 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.066010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt" (OuterVolumeSpecName: "kube-api-access-5f7qt") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "kube-api-access-5f7qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.082247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.100045 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory" (OuterVolumeSpecName: "inventory") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157905 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157943 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157953 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.574670 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.574840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerDied","Data":"8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23"} Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.575396 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23" Jan 20 19:00:46 crc kubenswrapper[4773]: I0120 19:00:46.447272 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:46 crc kubenswrapper[4773]: E0120 19:00:46.447540 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.057374 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.067553 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.457470 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" path="/var/lib/kubelet/pods/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4/volumes" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.148620 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:00 crc kubenswrapper[4773]: E0120 19:01:00.149360 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.149381 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.149618 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.150221 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.158614 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256072 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256192 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358533 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.365191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.365739 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.371393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.376382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.483073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:00.938250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.450023 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.733853 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerStarted","Data":"e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.733904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerStarted","Data":"42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.737372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.773847 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29482261-dqww9" podStartSLOduration=1.773823658 podStartE2EDuration="1.773823658s" podCreationTimestamp="2026-01-20 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:01:01.766232776 +0000 UTC m=+1854.688045880" watchObservedRunningTime="2026-01-20 19:01:01.773823658 +0000 UTC m=+1854.695636712" Jan 20 19:01:03 crc kubenswrapper[4773]: I0120 
19:01:03.756790 4773 generic.go:334] "Generic (PLEG): container finished" podID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerID="e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198" exitCode=0 Jan 20 19:01:03 crc kubenswrapper[4773]: I0120 19:01:03.756894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerDied","Data":"e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198"} Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.086743 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246500 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246602 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.252863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.260109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx" (OuterVolumeSpecName: "kube-api-access-t22gx") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "kube-api-access-t22gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.285582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.312856 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data" (OuterVolumeSpecName: "config-data") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349198 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349375 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349451 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349548 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: E0120 19:01:05.626995 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0951c0_055b_44bd_a686_9a4938af6b4f.slice/crio-42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0951c0_055b_44bd_a686_9a4938af6b4f.slice\": RecentStats: unable to find data in memory cache]" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774556 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" 
event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerDied","Data":"42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c"} Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774593 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774610 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.030029 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.037317 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.456259 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" path="/var/lib/kubelet/pods/61de3b4b-bcb7-4521-92e6-af87d03407ee/volumes" Jan 20 19:01:14 crc kubenswrapper[4773]: I0120 19:01:14.037333 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 19:01:14 crc kubenswrapper[4773]: I0120 19:01:14.046676 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 19:01:15 crc kubenswrapper[4773]: I0120 19:01:15.458115 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" path="/var/lib/kubelet/pods/9f9293b5-8288-4a19-b3ac-03d8026dbf06/volumes" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.215793 4773 scope.go:117] "RemoveContainer" containerID="6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.250383 4773 scope.go:117] 
"RemoveContainer" containerID="2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.288995 4773 scope.go:117] "RemoveContainer" containerID="eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.364851 4773 scope.go:117] "RemoveContainer" containerID="8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.413078 4773 scope.go:117] "RemoveContainer" containerID="ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.435072 4773 scope.go:117] "RemoveContainer" containerID="9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.471127 4773 scope.go:117] "RemoveContainer" containerID="ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d" Jan 20 19:01:58 crc kubenswrapper[4773]: I0120 19:01:58.040744 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 19:01:58 crc kubenswrapper[4773]: I0120 19:01:58.048516 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 19:01:59 crc kubenswrapper[4773]: I0120 19:01:59.456541 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" path="/var/lib/kubelet/pods/24c1bc90-8fe0-41b4-a7ba-7e15bc787386/volumes" Jan 20 19:02:19 crc kubenswrapper[4773]: I0120 19:02:19.612461 4773 scope.go:117] "RemoveContainer" containerID="7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0" Jan 20 19:03:28 crc kubenswrapper[4773]: I0120 19:03:28.170124 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:03:28 crc kubenswrapper[4773]: I0120 19:03:28.170792 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.784150 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:54 crc kubenswrapper[4773]: E0120 19:03:54.785107 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.785125 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.785339 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.786842 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.800172 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943281 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045267 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.074723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.108210 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.560377 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223253 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" exitCode=0 Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc"} Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223617 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerStarted","Data":"610302889b4174a8a42476f71a30781764aa2d78797362e8ed51738a7af8d517"} Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.225237 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:03:57 crc kubenswrapper[4773]: I0120 19:03:57.232994 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" exitCode=0 Jan 20 19:03:57 crc kubenswrapper[4773]: I0120 19:03:57.233053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5"} Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.170436 4773 patch_prober.go:28] interesting 
pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.170777 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.255734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerStarted","Data":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.278000 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z8cll" podStartSLOduration=2.77191214 podStartE2EDuration="4.277983701s" podCreationTimestamp="2026-01-20 19:03:54 +0000 UTC" firstStartedPulling="2026-01-20 19:03:56.225025246 +0000 UTC m=+2029.146838270" lastFinishedPulling="2026-01-20 19:03:57.731096807 +0000 UTC m=+2030.652909831" observedRunningTime="2026-01-20 19:03:58.272466025 +0000 UTC m=+2031.194279059" watchObservedRunningTime="2026-01-20 19:03:58.277983701 +0000 UTC m=+2031.199796725" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.108351 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.109022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc 
kubenswrapper[4773]: I0120 19:04:05.151224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.349096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.397295 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:07 crc kubenswrapper[4773]: I0120 19:04:07.323612 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z8cll" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" containerID="cri-o://fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" gracePeriod=2 Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.007781 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.185883 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.186020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.186126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.187473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities" (OuterVolumeSpecName: "utilities") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.192972 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p" (OuterVolumeSpecName: "kube-api-access-qfw8p") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "kube-api-access-qfw8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.216337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288145 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288184 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288195 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333767 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" exitCode=0 Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333830 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"610302889b4174a8a42476f71a30781764aa2d78797362e8ed51738a7af8d517"} Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333868 4773 scope.go:117] "RemoveContainer" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.357600 4773 scope.go:117] "RemoveContainer" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.374087 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.381634 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.398130 4773 scope.go:117] "RemoveContainer" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424333 4773 scope.go:117] "RemoveContainer" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 19:04:08.424783 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": container with ID starting with fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692 not found: ID does not exist" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424820 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} err="failed to get container status \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": rpc error: code = NotFound desc = could not find container \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": container with ID starting with fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692 not found: ID does not exist" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424848 4773 scope.go:117] "RemoveContainer" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 19:04:08.425273 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": container with ID starting with b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5 not found: ID does not exist" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425306 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5"} err="failed to get container status \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": rpc error: code = NotFound desc = could not find container \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": container with ID starting with b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5 not found: ID does not exist" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425324 4773 scope.go:117] "RemoveContainer" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 
19:04:08.425584 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": container with ID starting with 483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc not found: ID does not exist" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425615 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc"} err="failed to get container status \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": rpc error: code = NotFound desc = could not find container \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": container with ID starting with 483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc not found: ID does not exist" Jan 20 19:04:09 crc kubenswrapper[4773]: I0120 19:04:09.457658 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" path="/var/lib/kubelet/pods/187a0b19-a497-4133-a631-23e2c38c8e90/volumes" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.070380 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071317 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-content" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071334 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-content" Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071353 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" 
containerName="extract-utilities" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071360 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-utilities" Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071367 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071373 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071543 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.073172 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.080350 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150312 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " 
pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252329 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.253135 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " 
pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.253268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.278142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.442765 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.883867 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.374285 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" exitCode=0 Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.374394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161"} Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.375123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" 
event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"988b02aa46e97c905d50ba849c77f22209883cc3c9563e2d9c3d142f12351377"} Jan 20 19:04:14 crc kubenswrapper[4773]: I0120 19:04:14.385173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} Jan 20 19:04:15 crc kubenswrapper[4773]: I0120 19:04:15.395028 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" exitCode=0 Jan 20 19:04:15 crc kubenswrapper[4773]: I0120 19:04:15.395072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} Jan 20 19:04:16 crc kubenswrapper[4773]: I0120 19:04:16.405737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} Jan 20 19:04:16 crc kubenswrapper[4773]: I0120 19:04:16.430069 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qw5jk" podStartSLOduration=1.683221605 podStartE2EDuration="4.43004898s" podCreationTimestamp="2026-01-20 19:04:12 +0000 UTC" firstStartedPulling="2026-01-20 19:04:13.376442444 +0000 UTC m=+2046.298255468" lastFinishedPulling="2026-01-20 19:04:16.123269819 +0000 UTC m=+2049.045082843" observedRunningTime="2026-01-20 19:04:16.421833013 +0000 UTC m=+2049.343646057" watchObservedRunningTime="2026-01-20 19:04:16.43004898 +0000 UTC m=+2049.351862004" 
Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.443969 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.445108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.500632 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.590228 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.739747 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:24 crc kubenswrapper[4773]: I0120 19:04:24.473142 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qw5jk" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" containerID="cri-o://6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" gracePeriod=2 Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.018917 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.092337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.092788 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.093107 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.093911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities" (OuterVolumeSpecName: "utilities") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.112043 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px" (OuterVolumeSpecName: "kube-api-access-8p6px") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "kube-api-access-8p6px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.194798 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.194843 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.481965 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" exitCode=0 Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482006 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"988b02aa46e97c905d50ba849c77f22209883cc3c9563e2d9c3d142f12351377"} Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482053 4773 scope.go:117] "RemoveContainer" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482102 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.500105 4773 scope.go:117] "RemoveContainer" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.534419 4773 scope.go:117] "RemoveContainer" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566140 4773 scope.go:117] "RemoveContainer" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.566517 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": container with ID starting with 6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f not found: ID does not exist" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566550 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} err="failed to get container status \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": rpc error: code = NotFound desc = could not find container \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": container with ID starting with 6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f not found: ID does not exist" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566569 4773 scope.go:117] "RemoveContainer" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.567006 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": container with ID starting with 98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e not found: ID does not exist" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567091 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} err="failed to get container status \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": rpc error: code = NotFound desc = could not find container \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": container with ID starting with 98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e not found: ID does not exist" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567158 4773 scope.go:117] "RemoveContainer" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.567663 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": container with ID starting with cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161 not found: ID does not exist" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567736 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161"} err="failed to get container status \"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": rpc error: code = NotFound desc = could not find container 
\"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": container with ID starting with cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161 not found: ID does not exist" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.405508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.418398 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.722875 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.734290 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:27 crc kubenswrapper[4773]: I0120 19:04:27.457845 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b623f23a-f663-4807-903e-2633ba066f8a" path="/var/lib/kubelet/pods/b623f23a-f663-4807-903e-2633ba066f8a/volumes" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170539 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170603 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170648 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.171422 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.171478 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" gracePeriod=600 Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.518085 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" exitCode=0 Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.518149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.520268 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.520348 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:04:58 crc kubenswrapper[4773]: E0120 19:04:58.659378 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:37860->38.102.83.39:34695: write tcp 38.102.83.39:37860->38.102.83.39:34695: write: broken pipe Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.721489 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722061 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722073 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722091 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-content" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722097 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-content" Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722116 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-utilities" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722122 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-utilities" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722283 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.723493 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.731795 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.753964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.754599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.754639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856366 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856845 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856987 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.857393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.876825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.056973 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.575318 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:01 crc kubenswrapper[4773]: W0120 19:05:01.600828 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod461d1293_e47b_49e3_a6eb_bef90ad29792.slice/crio-c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec WatchSource:0}: Error finding container c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec: Status 404 returned error can't find the container with id c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.783465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.783792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec"} Jan 20 19:05:02 crc kubenswrapper[4773]: I0120 19:05:02.794120 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" 
containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" exitCode=0 Jan 20 19:05:02 crc kubenswrapper[4773]: I0120 19:05:02.794160 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} Jan 20 19:05:03 crc kubenswrapper[4773]: I0120 19:05:03.805663 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} Jan 20 19:05:04 crc kubenswrapper[4773]: I0120 19:05:04.827362 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" exitCode=0 Jan 20 19:05:04 crc kubenswrapper[4773]: I0120 19:05:04.827472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} Jan 20 19:05:05 crc kubenswrapper[4773]: I0120 19:05:05.841672 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.057377 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.058150 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.214194 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.236717 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lcnwg" podStartSLOduration=8.673459784 podStartE2EDuration="11.236703083s" podCreationTimestamp="2026-01-20 19:05:00 +0000 UTC" firstStartedPulling="2026-01-20 19:05:02.796147026 +0000 UTC m=+2095.717960050" lastFinishedPulling="2026-01-20 19:05:05.359390325 +0000 UTC m=+2098.281203349" observedRunningTime="2026-01-20 19:05:05.860595191 +0000 UTC m=+2098.782408235" watchObservedRunningTime="2026-01-20 19:05:11.236703083 +0000 UTC m=+2104.158516107" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.943556 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:13 crc kubenswrapper[4773]: I0120 19:05:13.289065 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:13 crc kubenswrapper[4773]: I0120 19:05:13.913239 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lcnwg" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" containerID="cri-o://5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" gracePeriod=2 Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.606371 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745142 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.746191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities" (OuterVolumeSpecName: "utilities") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.752238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96" (OuterVolumeSpecName: "kube-api-access-t7f96") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "kube-api-access-t7f96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.790862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847242 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847278 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847287 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922797 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" exitCode=0 Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922842 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922865 4773 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922886 4773 scope.go:117] "RemoveContainer" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec"} Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.942114 4773 scope.go:117] "RemoveContainer" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.966074 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.969585 4773 scope.go:117] "RemoveContainer" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.972719 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.002713 4773 scope.go:117] "RemoveContainer" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 19:05:15.003191 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": container with ID starting with 5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627 not found: ID does not exist" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003231 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} err="failed to get container status \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": rpc error: code = NotFound desc = could not find container \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": container with ID starting with 5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003256 4773 scope.go:117] "RemoveContainer" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 19:05:15.003614 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": container with ID starting with 0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47 not found: ID does not exist" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003644 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} err="failed to get container status \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": rpc error: code = NotFound desc = could not find container \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": container with ID starting with 0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003666 4773 scope.go:117] "RemoveContainer" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 
19:05:15.004016 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": container with ID starting with c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855 not found: ID does not exist" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.004038 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} err="failed to get container status \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": rpc error: code = NotFound desc = could not find container \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": container with ID starting with c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.465313 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" path="/var/lib/kubelet/pods/461d1293-e47b-49e3-a6eb-bef90ad29792/volumes" Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.501353 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.508478 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.517246 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.524721 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 
19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.531337 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.538871 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.545765 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.557048 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.564318 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.586911 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.593415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.601526 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.607715 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.614046 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.620280 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.626963 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.633344 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.639260 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.645703 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.651475 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.457221 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" path="/var/lib/kubelet/pods/02289c77-b6e5-4419-8dc4-597648db0e01/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.457892 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" path="/var/lib/kubelet/pods/0dd5218f-c5ee-4e0b-83bb-ab17d1887596/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.458562 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" path="/var/lib/kubelet/pods/10bff0cd-1771-46dc-87e8-a7ce91f520c8/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.459236 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" 
path="/var/lib/kubelet/pods/3ef8874c-43f1-43c9-ac7c-0af15c430e89/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.460458 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" path="/var/lib/kubelet/pods/4966c538-33c7-4d94-9705-0081ce04e9ef/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.461247 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" path="/var/lib/kubelet/pods/4be003e8-2c0f-45c8-944d-b126c8cbd1b0/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.462055 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" path="/var/lib/kubelet/pods/5f64745d-73ee-4219-b71f-b08d15f94f68/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.463375 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" path="/var/lib/kubelet/pods/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.464061 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" path="/var/lib/kubelet/pods/8f7fa4e8-571e-47fe-9e86-e83acb77eb77/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.464729 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" path="/var/lib/kubelet/pods/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf/volumes" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.492011 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.492945 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-content" Jan 20 19:05:29 crc 
kubenswrapper[4773]: I0120 19:05:29.492959 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-content" Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.492985 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-utilities" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.492992 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-utilities" Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.493010 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493018 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493177 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493719 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497269 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497391 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497505 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497648 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497776 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.516282 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.626895 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: 
\"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627164 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729646 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.730034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.730243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.736775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.737295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.737367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.738373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.752660 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.829648 4773 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:30 crc kubenswrapper[4773]: I0120 19:05:30.376534 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:31 crc kubenswrapper[4773]: I0120 19:05:31.048017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerStarted","Data":"372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d"} Jan 20 19:05:32 crc kubenswrapper[4773]: I0120 19:05:32.060638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerStarted","Data":"80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b"} Jan 20 19:05:32 crc kubenswrapper[4773]: I0120 19:05:32.076899 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" podStartSLOduration=2.627683545 podStartE2EDuration="3.076851939s" podCreationTimestamp="2026-01-20 19:05:29 +0000 UTC" firstStartedPulling="2026-01-20 19:05:30.383873475 +0000 UTC m=+2123.305686499" lastFinishedPulling="2026-01-20 19:05:30.833041869 +0000 UTC m=+2123.754854893" observedRunningTime="2026-01-20 19:05:32.07438654 +0000 UTC m=+2124.996199584" watchObservedRunningTime="2026-01-20 19:05:32.076851939 +0000 UTC m=+2124.998664973" Jan 20 19:05:44 crc kubenswrapper[4773]: I0120 19:05:44.158365 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerID="80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b" exitCode=0 Jan 20 19:05:44 crc kubenswrapper[4773]: I0120 19:05:44.158443 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerDied","Data":"80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b"} Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.568725 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744966 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.745046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.745081 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.751308 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.751807 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl" (OuterVolumeSpecName: "kube-api-access-mjxgl") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "kube-api-access-mjxgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.758109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph" (OuterVolumeSpecName: "ceph") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.774350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory" (OuterVolumeSpecName: "inventory") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.780168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849301 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849342 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849394 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849409 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849421 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186083 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerDied","Data":"372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d"} Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186131 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186194 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.261413 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:46 crc kubenswrapper[4773]: E0120 19:05:46.262094 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.262120 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.262473 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.263385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.265881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.266089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.266372 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.268890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.268982 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.275859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.358974 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: 
\"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359083 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359113 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 
19:05:46.460524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460606 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.464337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.476214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.595768 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:47 crc kubenswrapper[4773]: I0120 19:05:47.095818 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:47 crc kubenswrapper[4773]: I0120 19:05:47.194038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerStarted","Data":"f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216"} Jan 20 19:05:48 crc kubenswrapper[4773]: I0120 19:05:48.203549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerStarted","Data":"788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87"} Jan 20 19:05:48 crc kubenswrapper[4773]: I0120 19:05:48.230164 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" podStartSLOduration=1.742245094 podStartE2EDuration="2.230143202s" podCreationTimestamp="2026-01-20 19:05:46 +0000 UTC" firstStartedPulling="2026-01-20 19:05:47.104429073 +0000 UTC m=+2140.026242107" lastFinishedPulling="2026-01-20 19:05:47.592327191 +0000 UTC m=+2140.514140215" observedRunningTime="2026-01-20 19:05:48.224052623 +0000 UTC m=+2141.145865647" watchObservedRunningTime="2026-01-20 19:05:48.230143202 +0000 UTC m=+2141.151956246" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.773691 4773 scope.go:117] "RemoveContainer" containerID="275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.825781 4773 scope.go:117] "RemoveContainer" containerID="a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d" Jan 20 19:06:19 crc 
kubenswrapper[4773]: I0120 19:06:19.867120 4773 scope.go:117] "RemoveContainer" containerID="c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.909349 4773 scope.go:117] "RemoveContainer" containerID="8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.937697 4773 scope.go:117] "RemoveContainer" containerID="214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.976053 4773 scope.go:117] "RemoveContainer" containerID="d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.016496 4773 scope.go:117] "RemoveContainer" containerID="93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.054194 4773 scope.go:117] "RemoveContainer" containerID="ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.121368 4773 scope.go:117] "RemoveContainer" containerID="55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384" Jan 20 19:06:28 crc kubenswrapper[4773]: I0120 19:06:28.169672 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:06:28 crc kubenswrapper[4773]: I0120 19:06:28.171114 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:06:58 crc 
kubenswrapper[4773]: I0120 19:06:58.170372 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:06:58 crc kubenswrapper[4773]: I0120 19:06:58.171015 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:07:20 crc kubenswrapper[4773]: I0120 19:07:20.272518 4773 scope.go:117] "RemoveContainer" containerID="737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.586798 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.599952 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.600034 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.705842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.706142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.706339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod 
\"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808339 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808854 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.828130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.925665 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.459686 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953466 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" exitCode=0 Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953560 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"} Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953809 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerStarted","Data":"2b426e4d44dd06eedea970e43035e827898b4744cce96b3f32658e4da9d10aa5"} Jan 20 19:07:23 crc kubenswrapper[4773]: I0120 19:07:23.962698 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" exitCode=0 Jan 20 19:07:23 crc kubenswrapper[4773]: I0120 19:07:23.962786 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"} Jan 20 19:07:24 crc kubenswrapper[4773]: I0120 19:07:24.972172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" 
event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerStarted","Data":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} Jan 20 19:07:24 crc kubenswrapper[4773]: I0120 19:07:24.994871 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnf6n" podStartSLOduration=2.386128461 podStartE2EDuration="3.994846616s" podCreationTimestamp="2026-01-20 19:07:21 +0000 UTC" firstStartedPulling="2026-01-20 19:07:22.955973376 +0000 UTC m=+2235.877786400" lastFinishedPulling="2026-01-20 19:07:24.564691531 +0000 UTC m=+2237.486504555" observedRunningTime="2026-01-20 19:07:24.987542426 +0000 UTC m=+2237.909355470" watchObservedRunningTime="2026-01-20 19:07:24.994846616 +0000 UTC m=+2237.916659640" Jan 20 19:07:26 crc kubenswrapper[4773]: I0120 19:07:26.986242 4773 generic.go:334] "Generic (PLEG): container finished" podID="586f1b07-ae25-4acf-8a65-92377c4db234" containerID="788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87" exitCode=0 Jan 20 19:07:26 crc kubenswrapper[4773]: I0120 19:07:26.986333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerDied","Data":"788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87"} Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196308 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196391 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196441 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.197484 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.197545 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" gracePeriod=600 Jan 20 19:07:28 crc kubenswrapper[4773]: E0120 19:07:28.322776 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.405441 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.537965 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538257 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.543685 4773 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph" (OuterVolumeSpecName: "ceph") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.543796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.544069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv" (OuterVolumeSpecName: "kube-api-access-z4rdv") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "kube-api-access-z4rdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.562687 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.563280 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory" (OuterVolumeSpecName: "inventory") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640087 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640115 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640125 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640134 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640142 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010008 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" exitCode=0 Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010379 4773 scope.go:117] "RemoveContainer" containerID="7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.011159 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:07:29 crc kubenswrapper[4773]: E0120 19:07:29.011593 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.013656 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerDied","Data":"f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216"} Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.013694 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.015055 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.107387 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:29 crc kubenswrapper[4773]: E0120 19:07:29.107880 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.107904 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.108143 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.108946 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113488 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113745 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113880 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.114171 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.116591 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.121863 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: 
\"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc 
kubenswrapper[4773]: I0120 19:07:29.249711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.254146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.254358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.255499 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.264595 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.438517 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.951849 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:30 crc kubenswrapper[4773]: I0120 19:07:30.026435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerStarted","Data":"eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762"} Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.037304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerStarted","Data":"2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514"} Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.061597 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" podStartSLOduration=1.547008983 podStartE2EDuration="2.061575554s" podCreationTimestamp="2026-01-20 19:07:29 +0000 UTC" firstStartedPulling="2026-01-20 19:07:29.963334169 +0000 UTC m=+2242.885147223" lastFinishedPulling="2026-01-20 19:07:30.47790077 +0000 UTC m=+2243.399713794" observedRunningTime="2026-01-20 19:07:31.055829524 +0000 UTC m=+2243.977642588" watchObservedRunningTime="2026-01-20 19:07:31.061575554 +0000 UTC m=+2243.983388588" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.926531 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.926807 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.969464 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:32 crc kubenswrapper[4773]: I0120 19:07:32.086638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:32 crc kubenswrapper[4773]: I0120 19:07:32.209545 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.069392 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnf6n" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server" containerID="cri-o://44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" gracePeriod=2 Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.488787 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571318 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571536 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.575627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities" (OuterVolumeSpecName: "utilities") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.576683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll" (OuterVolumeSpecName: "kube-api-access-vd8ll") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "kube-api-access-vd8ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.658865 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673903 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673962 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673972 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079858 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" exitCode=0 Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079923 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"2b426e4d44dd06eedea970e43035e827898b4744cce96b3f32658e4da9d10aa5"} Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079942 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079958 4773 scope.go:117] "RemoveContainer" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.104366 4773 scope.go:117] "RemoveContainer" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.113365 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.121357 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.137515 4773 scope.go:117] "RemoveContainer" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.164572 4773 scope.go:117] "RemoveContainer" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: E0120 19:07:35.165080 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": container with ID starting with 44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5 not found: ID does not exist" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 
19:07:35.165123 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} err="failed to get container status \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": rpc error: code = NotFound desc = could not find container \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": container with ID starting with 44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5 not found: ID does not exist"
Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165171 4773 scope.go:117] "RemoveContainer" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"
Jan 20 19:07:35 crc kubenswrapper[4773]: E0120 19:07:35.165485 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": container with ID starting with 27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5 not found: ID does not exist" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"
Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165525 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"} err="failed to get container status \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": rpc error: code = NotFound desc = could not find container \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": container with ID starting with 27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5 not found: ID does not exist"
Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165549 4773 scope.go:117] "RemoveContainer" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"
Jan 20 19:07:35 crc kubenswrapper[4773]: E0120 19:07:35.165761 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": container with ID starting with 25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37 not found: ID does not exist" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"
Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165801 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"} err="failed to get container status \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": rpc error: code = NotFound desc = could not find container \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": container with ID starting with 25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37 not found: ID does not exist"
Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.461645 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" path="/var/lib/kubelet/pods/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6/volumes"
Jan 20 19:07:42 crc kubenswrapper[4773]: I0120 19:07:42.449669 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:07:42 crc kubenswrapper[4773]: E0120 19:07:42.450492 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:07:54 crc kubenswrapper[4773]: I0120 19:07:54.447918 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:07:54 crc kubenswrapper[4773]: E0120 19:07:54.450237 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:07:55 crc kubenswrapper[4773]: I0120 19:07:55.276842 4773 generic.go:334] "Generic (PLEG): container finished" podID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerID="2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514" exitCode=0
Jan 20 19:07:55 crc kubenswrapper[4773]: I0120 19:07:55.276884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerDied","Data":"2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514"}
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.689355 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.781970 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") "
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") "
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782096 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") "
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782232 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") "
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.787766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g" (OuterVolumeSpecName: "kube-api-access-htg4g") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "kube-api-access-htg4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.788069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph" (OuterVolumeSpecName: "ceph") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.808513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.811069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory" (OuterVolumeSpecName: "inventory") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883777 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883806 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883820 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883833 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") on node \"crc\" DevicePath \"\""
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerDied","Data":"eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762"}
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310642 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310733 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.382255 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"]
Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383255 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383279 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server"
Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383299 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-content"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383306 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-content"
Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383324 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383336 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-utilities"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383359 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-utilities"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383611 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383634 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.384438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.389679 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.390024 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.390170 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.393238 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.393378 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.398542 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"]
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501297 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501395 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.603647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604800 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.609840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.610075 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.611881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.622905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.703697 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:58 crc kubenswrapper[4773]: I0120 19:07:58.216295 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"]
Jan 20 19:07:58 crc kubenswrapper[4773]: I0120 19:07:58.319515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerStarted","Data":"84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"}
Jan 20 19:07:59 crc kubenswrapper[4773]: I0120 19:07:59.332427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerStarted","Data":"19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0"}
Jan 20 19:07:59 crc kubenswrapper[4773]: I0120 19:07:59.357068 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" podStartSLOduration=1.792651108 podStartE2EDuration="2.35704771s" podCreationTimestamp="2026-01-20 19:07:57 +0000 UTC" firstStartedPulling="2026-01-20 19:07:58.22107237 +0000 UTC m=+2271.142885394" lastFinishedPulling="2026-01-20 19:07:58.785468972 +0000 UTC m=+2271.707281996" observedRunningTime="2026-01-20 19:07:59.349145236 +0000 UTC m=+2272.270958290" watchObservedRunningTime="2026-01-20 19:07:59.35704771 +0000 UTC m=+2272.278860734"
Jan 20 19:08:04 crc kubenswrapper[4773]: I0120 19:08:04.372311 4773 generic.go:334] "Generic (PLEG): container finished" podID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerID="19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0" exitCode=0
Jan 20 19:08:04 crc kubenswrapper[4773]: I0120 19:08:04.372426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerDied","Data":"19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0"}
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.774560 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.958904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.964657 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph" (OuterVolumeSpecName: "ceph") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.966425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd" (OuterVolumeSpecName: "kube-api-access-59kzd") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "kube-api-access-59kzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.984618 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory" (OuterVolumeSpecName: "inventory") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.009328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061315 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061360 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061373 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061384 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerDied","Data":"84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"}
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388339 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388653 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466172 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:06 crc kubenswrapper[4773]: E0120 19:08:06.466587 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466613 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466847 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.474439 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.477375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.477823 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.478294 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479054 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479409 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.570628 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.570789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.571154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.571238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.673581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.674028 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.674483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.676526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.677562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.679360 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.681395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.692427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.795705 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.316246 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.396496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerStarted","Data":"6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe"}
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.455198 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:07 crc kubenswrapper[4773]: E0120 19:08:07.455436 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.833519 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:08:08 crc kubenswrapper[4773]: I0120 19:08:08.405745 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerStarted","Data":"016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a"}
Jan 20 19:08:08 crc kubenswrapper[4773]: I0120 19:08:08.429386 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" podStartSLOduration=1.927866182 podStartE2EDuration="2.429360363s" podCreationTimestamp="2026-01-20 19:08:06 +0000 UTC" firstStartedPulling="2026-01-20 19:08:07.329379006 +0000 UTC m=+2280.251192030" lastFinishedPulling="2026-01-20 19:08:07.830873187 +0000 UTC m=+2280.752686211" observedRunningTime="2026-01-20 19:08:08.422574146 +0000 UTC m=+2281.344387260" watchObservedRunningTime="2026-01-20 19:08:08.429360363 +0000 UTC m=+2281.351173397"
Jan 20 19:08:20 crc kubenswrapper[4773]: I0120 19:08:20.447636 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:20 crc kubenswrapper[4773]: E0120 19:08:20.448777 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:33 crc kubenswrapper[4773]: I0120 19:08:33.447164 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:33 crc kubenswrapper[4773]: E0120 19:08:33.449341 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:43 crc kubenswrapper[4773]: I0120 19:08:43.703927 4773 generic.go:334] "Generic (PLEG): container finished" podID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerID="016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a" exitCode=0
Jan 20 19:08:43 crc kubenswrapper[4773]: I0120 19:08:43.703978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerDied","Data":"016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a"}
Jan 20 19:08:44 crc kubenswrapper[4773]: I0120 19:08:44.446643 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:44 crc kubenswrapper[4773]: E0120 19:08:44.446926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.148463 4773 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268776 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.269130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.275259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph" (OuterVolumeSpecName: "ceph") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.275710 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4" (OuterVolumeSpecName: "kube-api-access-7c8z4") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "kube-api-access-7c8z4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.302847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.334539 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory" (OuterVolumeSpecName: "inventory") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372065 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372150 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372182 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372206 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.726588 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerDied","Data":"6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe"} Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.727019 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.726657 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.813857 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"] Jan 20 19:08:45 crc kubenswrapper[4773]: E0120 19:08:45.814288 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.814313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.814559 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.815265 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.817788 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.817821 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818065 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818491 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818668 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.825833 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"] Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: 
\"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084270 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084293 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.088752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.091356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.092296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: 
\"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.106237 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.136715 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.720580 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"] Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.741646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerStarted","Data":"592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d"} Jan 20 19:08:47 crc kubenswrapper[4773]: I0120 19:08:47.750624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerStarted","Data":"22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba"} Jan 20 19:08:47 crc kubenswrapper[4773]: I0120 19:08:47.770542 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" podStartSLOduration=2.201763887 podStartE2EDuration="2.770520747s" podCreationTimestamp="2026-01-20 19:08:45 +0000 UTC" 
firstStartedPulling="2026-01-20 19:08:46.724715946 +0000 UTC m=+2319.646528970" lastFinishedPulling="2026-01-20 19:08:47.293472806 +0000 UTC m=+2320.215285830" observedRunningTime="2026-01-20 19:08:47.767422592 +0000 UTC m=+2320.689235686" watchObservedRunningTime="2026-01-20 19:08:47.770520747 +0000 UTC m=+2320.692333791" Jan 20 19:08:51 crc kubenswrapper[4773]: I0120 19:08:51.785704 4773 generic.go:334] "Generic (PLEG): container finished" podID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerID="22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba" exitCode=0 Jan 20 19:08:51 crc kubenswrapper[4773]: I0120 19:08:51.785781 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerDied","Data":"22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba"} Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.199738 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335880 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335961 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.341522 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph" (OuterVolumeSpecName: "ceph") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.341832 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7" (OuterVolumeSpecName: "kube-api-access-lqff7") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "kube-api-access-lqff7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.365276 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.367141 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory" (OuterVolumeSpecName: "inventory") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438516 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438551 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438565 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438578 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") on node \"crc\" DevicePath \"\"" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.806923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerDied","Data":"592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d"} Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.807204 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.807014 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"] Jan 20 19:08:53 crc kubenswrapper[4773]: E0120 19:08:53.867568 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867585 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867755 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.869459 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872062 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872102 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872759 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.882565 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"] Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.048854 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: 
\"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.150965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151415 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.157508 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.160902 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.165482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.168437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.195602 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.672713 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"]
Jan 20 19:08:54 crc kubenswrapper[4773]: W0120 19:08:54.688092 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ce8585_331b_44ef_b8f8_aa5cb3b96589.slice/crio-65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818 WatchSource:0}: Error finding container 65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818: Status 404 returned error can't find the container with id 65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818
Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.816290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerStarted","Data":"65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818"}
Jan 20 19:08:55 crc kubenswrapper[4773]: I0120 19:08:55.825770 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerStarted","Data":"0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d"}
Jan 20 19:08:55 crc kubenswrapper[4773]: I0120 19:08:55.845801 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" podStartSLOduration=2.339451274 podStartE2EDuration="2.845784413s" podCreationTimestamp="2026-01-20 19:08:53 +0000 UTC" firstStartedPulling="2026-01-20 19:08:54.693260669 +0000 UTC m=+2327.615073693" lastFinishedPulling="2026-01-20 19:08:55.199593808 +0000 UTC m=+2328.121406832" observedRunningTime="2026-01-20 19:08:55.839618822 +0000 UTC m=+2328.761431866" watchObservedRunningTime="2026-01-20 19:08:55.845784413 +0000 UTC m=+2328.767597437"
Jan 20 19:08:58 crc kubenswrapper[4773]: I0120 19:08:58.446846 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:58 crc kubenswrapper[4773]: E0120 19:08:58.447420 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:09:12 crc kubenswrapper[4773]: I0120 19:09:12.447907 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:09:12 crc kubenswrapper[4773]: E0120 19:09:12.448731 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:09:24 crc kubenswrapper[4773]: I0120 19:09:24.447225 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:09:24 crc kubenswrapper[4773]: E0120 19:09:24.448073 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:09:36 crc kubenswrapper[4773]: I0120 19:09:36.110368 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerID="0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d" exitCode=0
Jan 20 19:09:36 crc kubenswrapper[4773]: I0120 19:09:36.110444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerDied","Data":"0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d"}
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.541382 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") "
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") "
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651759 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") "
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") "
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.663433 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr" (OuterVolumeSpecName: "kube-api-access-55gfr") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "kube-api-access-55gfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.664053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph" (OuterVolumeSpecName: "ceph") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.676316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory" (OuterVolumeSpecName: "inventory") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.676820 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753436 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753469 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753480 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753490 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127269 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerDied","Data":"65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818"}
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127598 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127336 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235334 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"]
Jan 20 19:09:38 crc kubenswrapper[4773]: E0120 19:09:38.235687 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235706 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235870 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.236435 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.238540 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.238705 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.253898 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.254252 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.255475 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.285310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"]
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366613 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366795 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.469037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.473820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.481579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.484563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.485247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.572848 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.123712 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"]
Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.127611 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.136637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerStarted","Data":"d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466"}
Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.447051 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:09:39 crc kubenswrapper[4773]: E0120 19:09:39.447319 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:09:41 crc kubenswrapper[4773]: I0120 19:09:41.155173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerStarted","Data":"cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e"}
Jan 20 19:09:41 crc kubenswrapper[4773]: I0120 19:09:41.180574 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" podStartSLOduration=2.649382788 podStartE2EDuration="3.180554343s" podCreationTimestamp="2026-01-20 19:09:38 +0000 UTC" firstStartedPulling="2026-01-20 19:09:39.127404205 +0000 UTC m=+2372.049217219" lastFinishedPulling="2026-01-20 19:09:39.65857576 +0000 UTC m=+2372.580388774" observedRunningTime="2026-01-20 19:09:41.176122657 +0000 UTC m=+2374.097935691" watchObservedRunningTime="2026-01-20 19:09:41.180554343 +0000 UTC m=+2374.102367367"
Jan 20 19:09:49 crc kubenswrapper[4773]: I0120 19:09:49.231788 4773 generic.go:334] "Generic (PLEG): container finished" podID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerID="cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e" exitCode=0
Jan 20 19:09:49 crc kubenswrapper[4773]: I0120 19:09:49.231894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerDied","Data":"cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e"}
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.637059 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") "
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") "
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691863 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") "
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691915 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") "
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.697351 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52" (OuterVolumeSpecName: "kube-api-access-k4q52") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "kube-api-access-k4q52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.698216 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph" (OuterVolumeSpecName: "ceph") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.715979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.720997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794198 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794231 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794241 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794249 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") on node \"crc\" DevicePath \"\""
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251297 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerDied","Data":"d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466"}
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251368 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251436 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.323684 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"]
Jan 20 19:09:51 crc kubenswrapper[4773]: E0120 19:09:51.324061 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324077 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324227 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324827 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329496 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329755 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329899 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.330160 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.330278 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.347313 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"]
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405364 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.447013 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:09:51 crc kubenswrapper[4773]: E0120 19:09:51.447395 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.511883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.512697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.513792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.532578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.642771 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:09:52 crc kubenswrapper[4773]: I0120 19:09:52.138089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"]
Jan 20 19:09:52 crc kubenswrapper[4773]: I0120 19:09:52.258116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerStarted","Data":"950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc"}
Jan 20 19:09:53 crc kubenswrapper[4773]: I0120 19:09:53.278492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerStarted","Data":"9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e"}
Jan 20 19:09:53 crc kubenswrapper[4773]: I0120 19:09:53.298799 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" podStartSLOduration=1.860882224 podStartE2EDuration="2.298778642s" podCreationTimestamp="2026-01-20 19:09:51 +0000 UTC" firstStartedPulling="2026-01-20 19:09:52.143995855 +0000 UTC m=+2385.065808879" lastFinishedPulling="2026-01-20 19:09:52.581892273 +0000 UTC m=+2385.503705297" observedRunningTime="2026-01-20 19:09:53.296920738 +0000 UTC m=+2386.218733782" watchObservedRunningTime="2026-01-20 19:09:53.298778642 +0000 UTC m=+2386.220591686"
Jan 20 19:10:00 crc kubenswrapper[4773]: I0120 19:10:00.330797 4773 generic.go:334] "Generic (PLEG): container finished" podID="617e6a58-e676-42e3-a897-939d9072d030" containerID="9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e" exitCode=0
Jan 20 19:10:00 crc kubenswrapper[4773]: I0120 19:10:00.330883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerDied","Data":"9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e"}
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.726503 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.900770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") "
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.900849 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") "
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.901005 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") "
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.901972 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") "
Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.907093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph" (OuterVolumeSpecName: "ceph") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.907156 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz" (OuterVolumeSpecName: "kube-api-access-c65cz") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "kube-api-access-c65cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.926436 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.937109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory" (OuterVolumeSpecName: "inventory") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004437 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004471 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004483 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerDied","Data":"950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc"} Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347119 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347150 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.423830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:02 crc kubenswrapper[4773]: E0120 19:10:02.424260 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.424281 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.424451 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.425024 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.426979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427005 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427529 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427657 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427732 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.444619 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.617713 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618079 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720415 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.726897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.728463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.737037 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 
20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.739817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.743776 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.059680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.356290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerStarted","Data":"5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49"} Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.448436 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:03 crc kubenswrapper[4773]: E0120 19:10:03.449198 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:04 crc kubenswrapper[4773]: I0120 19:10:04.364089 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerStarted","Data":"4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1"} Jan 20 19:10:04 crc kubenswrapper[4773]: I0120 19:10:04.384421 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" podStartSLOduration=1.900983171 podStartE2EDuration="2.384400893s" podCreationTimestamp="2026-01-20 19:10:02 +0000 UTC" firstStartedPulling="2026-01-20 19:10:03.063208064 +0000 UTC m=+2395.985021088" lastFinishedPulling="2026-01-20 19:10:03.546625786 +0000 UTC m=+2396.468438810" observedRunningTime="2026-01-20 19:10:04.382239291 +0000 UTC m=+2397.304052315" watchObservedRunningTime="2026-01-20 19:10:04.384400893 +0000 UTC m=+2397.306213917" Jan 20 19:10:13 crc kubenswrapper[4773]: I0120 19:10:13.436345 4773 generic.go:334] "Generic (PLEG): container finished" podID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerID="4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1" exitCode=0 Jan 20 19:10:13 crc kubenswrapper[4773]: I0120 19:10:13.436420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerDied","Data":"4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1"} Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.843615 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938721 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938977 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.945184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph" (OuterVolumeSpecName: "ceph") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.949121 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt" (OuterVolumeSpecName: "kube-api-access-w6mvt") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "kube-api-access-w6mvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.963250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.965378 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory" (OuterVolumeSpecName: "inventory") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040203 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040244 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040257 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040268 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.447179 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:15 crc kubenswrapper[4773]: E0120 19:10:15.447512 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.456620 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.457387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerDied","Data":"5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49"} Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.457431 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.529618 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:15 crc kubenswrapper[4773]: E0120 19:10:15.530098 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.530124 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.530334 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.531371 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536655 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536791 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536867 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537154 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537301 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537340 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537777 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.542216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.545955 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550135 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550187 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651478 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651501 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651554 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 
20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.656261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.656441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658719 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.659673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.659870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.660723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.660993 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.667609 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.668713 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 
20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.671708 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.882379 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:16 crc kubenswrapper[4773]: I0120 19:10:16.387709 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:16 crc kubenswrapper[4773]: W0120 19:10:16.391287 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda459169f_671f_4dd7_96d3_019d59bd14c6.slice/crio-952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83 WatchSource:0}: Error finding container 952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83: Status 404 returned error can't find the container with id 952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83 Jan 20 19:10:16 crc kubenswrapper[4773]: I0120 19:10:16.468154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerStarted","Data":"952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83"} Jan 20 19:10:17 crc kubenswrapper[4773]: I0120 19:10:17.480335 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" 
event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerStarted","Data":"641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4"} Jan 20 19:10:17 crc kubenswrapper[4773]: I0120 19:10:17.507781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" podStartSLOduration=1.964709439 podStartE2EDuration="2.507761177s" podCreationTimestamp="2026-01-20 19:10:15 +0000 UTC" firstStartedPulling="2026-01-20 19:10:16.394515556 +0000 UTC m=+2409.316328580" lastFinishedPulling="2026-01-20 19:10:16.937567294 +0000 UTC m=+2409.859380318" observedRunningTime="2026-01-20 19:10:17.500893261 +0000 UTC m=+2410.422706285" watchObservedRunningTime="2026-01-20 19:10:17.507761177 +0000 UTC m=+2410.429574201" Jan 20 19:10:28 crc kubenswrapper[4773]: I0120 19:10:28.447720 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:28 crc kubenswrapper[4773]: E0120 19:10:28.448614 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:42 crc kubenswrapper[4773]: I0120 19:10:42.447263 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:42 crc kubenswrapper[4773]: E0120 19:10:42.448188 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:45 crc kubenswrapper[4773]: I0120 19:10:45.709174 4773 generic.go:334] "Generic (PLEG): container finished" podID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerID="641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4" exitCode=0 Jan 20 19:10:45 crc kubenswrapper[4773]: I0120 19:10:45.709271 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerDied","Data":"641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4"} Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.156565 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341197 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341227 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341247 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341303 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341333 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341375 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341394 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341461 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.347768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.347997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.348130 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc" (OuterVolumeSpecName: "kube-api-access-rlzzc") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "kube-api-access-rlzzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.348426 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.350582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.350740 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.351168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.352513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.359224 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.381711 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph" (OuterVolumeSpecName: "ceph") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.387817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.389819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory" (OuterVolumeSpecName: "inventory") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.395757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444036 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444072 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444086 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444122 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444137 4773 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444151 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444162 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444173 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444184 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444195 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444206 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444218 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444229 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerDied","Data":"952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83"} Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727191 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727205 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.816763 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:47 crc kubenswrapper[4773]: E0120 19:10:47.817147 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.817166 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.817349 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.818117 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.820291 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.820643 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.821099 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.821174 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.822465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.835924 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: 
\"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.951873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952007 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952037 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.957768 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.962001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.962606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: 
\"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.968361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.150321 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.704997 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.742069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerStarted","Data":"c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5"} Jan 20 19:10:49 crc kubenswrapper[4773]: I0120 19:10:49.752623 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerStarted","Data":"5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c"} Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.447302 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:54 crc kubenswrapper[4773]: E0120 19:10:54.448400 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.789295 4773 generic.go:334] "Generic (PLEG): container finished" podID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerID="5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c" exitCode=0 Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.789334 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerDied","Data":"5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c"} Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.204828 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394657 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.400293 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc" (OuterVolumeSpecName: "kube-api-access-j22gc") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "kube-api-access-j22gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.410467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph" (OuterVolumeSpecName: "ceph") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.420497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory" (OuterVolumeSpecName: "inventory") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.421067 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497193 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497454 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497477 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497489 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerDied","Data":"c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5"} Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811850 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811863 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891093 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:56 crc kubenswrapper[4773]: E0120 19:10:56.891487 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891507 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.892288 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.894362 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.894640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899327 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899387 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899444 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899594 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.902764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006850 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006945 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.108894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.108983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.150586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.151441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.151611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc 
kubenswrapper[4773]: I0120 19:10:57.154343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.154664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.166714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.210166 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.728104 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.819664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerStarted","Data":"551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2"} Jan 20 19:10:58 crc kubenswrapper[4773]: I0120 19:10:58.828066 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerStarted","Data":"804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6"} Jan 20 19:10:58 crc kubenswrapper[4773]: I0120 19:10:58.855394 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" podStartSLOduration=2.30635961 podStartE2EDuration="2.855375581s" podCreationTimestamp="2026-01-20 19:10:56 +0000 UTC" firstStartedPulling="2026-01-20 19:10:57.733498309 +0000 UTC m=+2450.655311333" lastFinishedPulling="2026-01-20 19:10:58.28251428 +0000 UTC m=+2451.204327304" observedRunningTime="2026-01-20 19:10:58.84868897 +0000 UTC m=+2451.770501994" watchObservedRunningTime="2026-01-20 19:10:58.855375581 +0000 UTC m=+2451.777188615" Jan 20 19:11:05 crc kubenswrapper[4773]: I0120 19:11:05.448113 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:05 crc kubenswrapper[4773]: E0120 19:11:05.448888 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:20 crc kubenswrapper[4773]: I0120 19:11:20.446752 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:20 crc kubenswrapper[4773]: E0120 19:11:20.448594 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:34 crc kubenswrapper[4773]: I0120 19:11:34.447827 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:34 crc kubenswrapper[4773]: E0120 19:11:34.448714 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:46 crc kubenswrapper[4773]: I0120 19:11:46.447634 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:46 crc kubenswrapper[4773]: E0120 19:11:46.449068 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:57 crc kubenswrapper[4773]: I0120 19:11:57.465042 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:57 crc kubenswrapper[4773]: E0120 19:11:57.465908 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:04 crc kubenswrapper[4773]: I0120 19:12:04.345231 4773 generic.go:334] "Generic (PLEG): container finished" podID="de805082-3188-4adb-9607-4ec5535de661" containerID="804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6" exitCode=0 Jan 20 19:12:04 crc kubenswrapper[4773]: I0120 19:12:04.345288 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerDied","Data":"804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6"} Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.781676 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934469 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.940038 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.940715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh" (OuterVolumeSpecName: "kube-api-access-x7xgh") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "kube-api-access-x7xgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.941057 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph" (OuterVolumeSpecName: "ceph") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.961879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.964346 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory" (OuterVolumeSpecName: "inventory") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.964469 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036038 4773 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036086 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036099 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036111 4773 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036123 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036140 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.364959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerDied","Data":"551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2"} Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.365245 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.365110 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.449470 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:06 crc kubenswrapper[4773]: E0120 19:12:06.449846 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.449865 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.450063 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.450787 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457514 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457806 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457948 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458187 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458221 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458488 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.461018 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc 
kubenswrapper[4773]: I0120 19:12:06.646779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.647290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc 
kubenswrapper[4773]: I0120 19:12:06.647507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.647548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749093 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749193 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749230 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.756309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: 
\"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.761464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.761615 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.767600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:07 crc kubenswrapper[4773]: I0120 19:12:07.066625 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:07 crc kubenswrapper[4773]: I0120 19:12:07.586274 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.047089 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.380753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerStarted","Data":"66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5"} Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.380794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerStarted","Data":"a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac"} Jan 20 19:12:12 crc kubenswrapper[4773]: I0120 19:12:12.447202 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:12 crc kubenswrapper[4773]: E0120 19:12:12.448067 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:24 crc kubenswrapper[4773]: I0120 19:12:24.447141 4773 scope.go:117] "RemoveContainer" 
containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:24 crc kubenswrapper[4773]: E0120 19:12:24.448111 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:35 crc kubenswrapper[4773]: I0120 19:12:35.447524 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:36 crc kubenswrapper[4773]: I0120 19:12:36.591882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"} Jan 20 19:12:36 crc kubenswrapper[4773]: I0120 19:12:36.613791 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" podStartSLOduration=30.164066101 podStartE2EDuration="30.613765852s" podCreationTimestamp="2026-01-20 19:12:06 +0000 UTC" firstStartedPulling="2026-01-20 19:12:07.59457243 +0000 UTC m=+2520.516385454" lastFinishedPulling="2026-01-20 19:12:08.044272181 +0000 UTC m=+2520.966085205" observedRunningTime="2026-01-20 19:12:08.403094839 +0000 UTC m=+2521.324907883" watchObservedRunningTime="2026-01-20 19:12:36.613765852 +0000 UTC m=+2549.535578876" Jan 20 19:13:06 crc kubenswrapper[4773]: I0120 19:13:06.836600 4773 generic.go:334] "Generic (PLEG): container finished" podID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerID="66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5" 
exitCode=0 Jan 20 19:13:06 crc kubenswrapper[4773]: I0120 19:13:06.836671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerDied","Data":"66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5"} Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.246087 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.269768 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270155 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270206 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270220 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod 
\"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.303690 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp" (OuterVolumeSpecName: "kube-api-access-sgnbp") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "kube-api-access-sgnbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.304308 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.305109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph" (OuterVolumeSpecName: "ceph") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.307431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.318446 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.320591 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.351598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory" (OuterVolumeSpecName: "inventory") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371459 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371495 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371507 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371516 4773 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371526 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371534 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371544 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853323 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerDied","Data":"a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac"} Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853670 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853394 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.025917 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:09 crc kubenswrapper[4773]: E0120 19:13:09.026307 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.026332 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.026541 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.027087 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.030523 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.030847 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031116 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031215 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.050524 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081769 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.082556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.082637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: 
\"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.183616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184472 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187305 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.188089 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.194508 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.201601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.348108 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.909219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.868687 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerStarted","Data":"e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a"} Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.868723 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerStarted","Data":"4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db"} Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.888902 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" podStartSLOduration=1.218880168 podStartE2EDuration="1.888880877s" podCreationTimestamp="2026-01-20 19:13:09 +0000 UTC" firstStartedPulling="2026-01-20 19:13:09.917340629 +0000 UTC m=+2582.839153653" lastFinishedPulling="2026-01-20 19:13:10.587341338 +0000 UTC m=+2583.509154362" observedRunningTime="2026-01-20 19:13:10.885488006 +0000 UTC m=+2583.807301030" watchObservedRunningTime="2026-01-20 19:13:10.888880877 +0000 UTC m=+2583.810693901" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.417807 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.420371 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.428835 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609497 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.712010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.748408 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.748910 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.195479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440784 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4" exitCode=0 Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4"} Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerStarted","Data":"fbf9aeb5648d8d46e58a5c7ca6f6106654eda90c176d1321e2b4db729aa24b85"} Jan 20 19:14:22 crc kubenswrapper[4773]: E0120 19:14:22.501577 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9dc5be9_7ed4_4735_8fa8_42cdd10b8a9c.slice/crio-conmon-a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4.scope\": RecentStats: unable to find data in memory cache]" Jan 20 19:14:24 crc kubenswrapper[4773]: I0120 19:14:24.468731 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737" exitCode=0 Jan 20 19:14:24 crc kubenswrapper[4773]: I0120 19:14:24.469205 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" 
event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737"} Jan 20 19:14:25 crc kubenswrapper[4773]: I0120 19:14:25.480594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerStarted","Data":"a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b"} Jan 20 19:14:25 crc kubenswrapper[4773]: I0120 19:14:25.498342 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2n25g" podStartSLOduration=1.94270562 podStartE2EDuration="4.498320873s" podCreationTimestamp="2026-01-20 19:14:21 +0000 UTC" firstStartedPulling="2026-01-20 19:14:22.44319116 +0000 UTC m=+2655.365004184" lastFinishedPulling="2026-01-20 19:14:24.998806413 +0000 UTC m=+2657.920619437" observedRunningTime="2026-01-20 19:14:25.496354486 +0000 UTC m=+2658.418167510" watchObservedRunningTime="2026-01-20 19:14:25.498320873 +0000 UTC m=+2658.420133897" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.749477 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.749811 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.796429 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:32 crc kubenswrapper[4773]: I0120 19:14:32.578686 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:32 crc kubenswrapper[4773]: I0120 19:14:32.864777 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:34 crc kubenswrapper[4773]: I0120 19:14:34.560227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2n25g" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" containerID="cri-o://a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" gracePeriod=2 Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.588384 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" exitCode=0 Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.588451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b"} Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.676684 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.813051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814281 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814859 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities" (OuterVolumeSpecName: "utilities") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.832136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6" (OuterVolumeSpecName: "kube-api-access-qfst6") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "kube-api-access-qfst6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.916623 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.916956 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.937388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.018196 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"fbf9aeb5648d8d46e58a5c7ca6f6106654eda90c176d1321e2b4db729aa24b85"} Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598296 4773 scope.go:117] "RemoveContainer" containerID="a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598400 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.635001 4773 scope.go:117] "RemoveContainer" containerID="1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.639400 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.648143 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.658342 4773 scope.go:117] "RemoveContainer" containerID="a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4" Jan 20 19:14:39 crc kubenswrapper[4773]: I0120 19:14:39.457346 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" path="/var/lib/kubelet/pods/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c/volumes" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.306847 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310824 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310862 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310888 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-content" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310896 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-content" Jan 20 
19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310924 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-utilities" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310950 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-utilities" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.311142 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.312683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.316994 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.453840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.453978 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.454037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.556053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.556585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.582812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.636540 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.157181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"]
Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719681 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2" exitCode=0
Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719728 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"}
Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerStarted","Data":"3a9c1c43a90ff7044ed955a46a9a095e67bfd81e8256db923dda0ab6848b0fbf"}
Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.721763 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 20 19:14:53 crc kubenswrapper[4773]: I0120 19:14:53.728840 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf" exitCode=0
Jan 20 19:14:53 crc kubenswrapper[4773]: I0120 19:14:53.729156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"}
Jan 20 19:14:54 crc kubenswrapper[4773]: I0120 19:14:54.738633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerStarted","Data":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"}
Jan 20 19:14:54 crc kubenswrapper[4773]: I0120 19:14:54.768225 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jplxv" podStartSLOduration=2.297275949 podStartE2EDuration="3.768205395s" podCreationTimestamp="2026-01-20 19:14:51 +0000 UTC" firstStartedPulling="2026-01-20 19:14:52.721534647 +0000 UTC m=+2685.643347671" lastFinishedPulling="2026-01-20 19:14:54.192464083 +0000 UTC m=+2687.114277117" observedRunningTime="2026-01-20 19:14:54.765301305 +0000 UTC m=+2687.687114329" watchObservedRunningTime="2026-01-20 19:14:54.768205395 +0000 UTC m=+2687.690018419"
Jan 20 19:14:58 crc kubenswrapper[4773]: I0120 19:14:58.170426 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:14:58 crc kubenswrapper[4773]: I0120 19:14:58.170982 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.159669 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"]
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.165016 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.168167 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.168231 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.174708 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"]
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232736 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.339835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.340226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.340328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.341741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.347716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.357679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.489823 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.918876 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"]
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.636744 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.637040 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.681420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801273 4773 generic.go:334] "Generic (PLEG): container finished" podID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerID="d2f5f6165c5bd6d2e78266b658c2c0568526cc830bd6d23c38285c251c623a30" exitCode=0
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801404 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerDied","Data":"d2f5f6165c5bd6d2e78266b658c2c0568526cc830bd6d23c38285c251c623a30"}
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801456 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerStarted","Data":"e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370"}
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.857400 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.913874 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"]
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.112969 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.195875 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") "
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196048 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") "
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") "
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume" (OuterVolumeSpecName: "config-volume") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.201503 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.202000 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x" (OuterVolumeSpecName: "kube-api-access-mj25x") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "kube-api-access-mj25x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298265 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298304 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298316 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerDied","Data":"e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370"}
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817964 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370"
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817613 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"
Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817739 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jplxv" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" containerID="cri-o://b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" gracePeriod=2
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.188852 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"]
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.196539 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"]
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.742365 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") "
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") "
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") "
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.826376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities" (OuterVolumeSpecName: "utilities") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827693 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" exitCode=0
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"}
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827757 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"3a9c1c43a90ff7044ed955a46a9a095e67bfd81e8256db923dda0ab6848b0fbf"}
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827756 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827773 4773 scope.go:117] "RemoveContainer" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.832324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd" (OuterVolumeSpecName: "kube-api-access-tzhjd") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "kube-api-access-tzhjd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.848453 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.882384 4773 scope.go:117] "RemoveContainer" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.902300 4773 scope.go:117] "RemoveContainer" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927901 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927976 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927992 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") on node \"crc\" DevicePath \"\""
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949016 4773 scope.go:117] "RemoveContainer" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"
Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.949541 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": container with ID starting with b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160 not found: ID does not exist" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949635 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"} err="failed to get container status \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": rpc error: code = NotFound desc = could not find container \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": container with ID starting with b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160 not found: ID does not exist"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949677 4773 scope.go:117] "RemoveContainer" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"
Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.950272 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": container with ID starting with d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf not found: ID does not exist" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950316 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"} err="failed to get container status \"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": rpc error: code = NotFound desc = could not find container \"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": container with ID starting with d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf not found: ID does not exist"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950345 4773 scope.go:117] "RemoveContainer" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"
Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.950904 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": container with ID starting with 895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2 not found: ID does not exist" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"
Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950974 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"} err="failed to get container status \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": rpc error: code = NotFound desc = could not find container \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": container with ID starting with 895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2 not found: ID does not exist"
Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.165586 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"]
Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.173848 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"]
Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.460015 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" path="/var/lib/kubelet/pods/007a1e5a-0e90-44d1-b19d-e92154fb6a3d/volumes"
Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.460773 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" path="/var/lib/kubelet/pods/47e6cea6-90a9-46c1-8c9f-c36182604be7/volumes"
Jan 20 19:15:20 crc kubenswrapper[4773]: I0120 19:15:20.518427 4773 scope.go:117] "RemoveContainer" containerID="b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5"
Jan 20 19:15:28 crc kubenswrapper[4773]: I0120 19:15:28.169707 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:15:28 crc kubenswrapper[4773]: I0120 19:15:28.170282 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.170110 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.171647 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.171758 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.172579 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.172712 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" gracePeriod=600
Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.276988 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" exitCode=0
Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.277065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"}
Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.277357 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"}
Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.277385 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:17:26 crc kubenswrapper[4773]: I0120 19:17:26.557838 4773 generic.go:334] "Generic (PLEG): container finished" podID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerID="e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a" exitCode=0
Jan 20 19:17:26 crc kubenswrapper[4773]: I0120 19:17:26.557980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerDied","Data":"e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a"}
Jan 20 19:17:27 crc kubenswrapper[4773]: I0120 19:17:27.951351 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091029 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091182 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091246 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") "
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.096660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.097400 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph" (OuterVolumeSpecName: "ceph") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.104059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v" (OuterVolumeSpecName: "kube-api-access-9mr5v") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "kube-api-access-9mr5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.117271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.126213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.128100 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory" (OuterVolumeSpecName: "inventory") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.192970 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193007 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193017 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193025 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193034 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193045 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerDied","Data":"4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db"}
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577767 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577540 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.661562 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"]
Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662005 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662030 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server"
Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662055 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-content"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662063 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-content"
Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662077 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles"
Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662087 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles"
Jan 20 19:17:28 crc
kubenswrapper[4773]: E0120 19:17:28.662103 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662113 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662134 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-utilities" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662141 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-utilities" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662346 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662390 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662403 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.663136 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670053 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670974 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671162 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671563 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.674391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"] Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803278 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.804711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.804800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906767 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906853 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907019 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907061 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.908331 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.908609 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.911737 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.912808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.912900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.914631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.915629 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916272 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.925398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.983890 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:29 crc kubenswrapper[4773]: I0120 19:17:29.501562 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"] Jan 20 19:17:29 crc kubenswrapper[4773]: I0120 19:17:29.586308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerStarted","Data":"53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231"} Jan 20 19:17:30 crc kubenswrapper[4773]: I0120 19:17:30.596304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerStarted","Data":"e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0"} Jan 20 19:17:30 crc kubenswrapper[4773]: I0120 19:17:30.625048 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" podStartSLOduration=2.111621515 podStartE2EDuration="2.625026049s" podCreationTimestamp="2026-01-20 19:17:28 +0000 UTC" firstStartedPulling="2026-01-20 19:17:29.514383977 +0000 UTC m=+2842.436197021" lastFinishedPulling="2026-01-20 19:17:30.027788531 +0000 UTC m=+2842.949601555" observedRunningTime="2026-01-20 19:17:30.618463541 +0000 UTC m=+2843.540276575" watchObservedRunningTime="2026-01-20 19:17:30.625026049 +0000 UTC m=+2843.546839073" Jan 20 19:17:58 crc kubenswrapper[4773]: I0120 19:17:58.170026 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 
20 19:17:58 crc kubenswrapper[4773]: I0120 19:17:58.171269 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.860433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.863438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.869776 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.970505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.970902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.971010 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073272 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.113192 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.189713 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.746163 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024255 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" exitCode=0 Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f"} Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"d62d5c25c7ac922af447559957ad4ea85cc0573cc46241785ca6c1bbb0f1c947"} Jan 20 19:18:24 crc kubenswrapper[4773]: I0120 19:18:24.033460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} Jan 20 19:18:25 crc kubenswrapper[4773]: I0120 19:18:25.043194 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" exitCode=0 Jan 20 19:18:25 crc kubenswrapper[4773]: I0120 19:18:25.044492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} Jan 20 19:18:26 crc kubenswrapper[4773]: I0120 19:18:26.052916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} Jan 20 19:18:26 crc kubenswrapper[4773]: I0120 19:18:26.074647 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2df5" podStartSLOduration=2.572395366 podStartE2EDuration="5.074630596s" podCreationTimestamp="2026-01-20 19:18:21 +0000 UTC" firstStartedPulling="2026-01-20 19:18:23.027516475 +0000 UTC m=+2895.949329499" lastFinishedPulling="2026-01-20 19:18:25.529751705 +0000 UTC m=+2898.451564729" observedRunningTime="2026-01-20 19:18:26.071609913 +0000 UTC m=+2898.993422927" watchObservedRunningTime="2026-01-20 19:18:26.074630596 +0000 UTC m=+2898.996443620" Jan 20 19:18:28 crc kubenswrapper[4773]: I0120 19:18:28.170862 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:18:28 crc kubenswrapper[4773]: I0120 19:18:28.171541 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.190767 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.191230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.238011 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:33 crc kubenswrapper[4773]: I0120 19:18:33.154290 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:33 crc kubenswrapper[4773]: I0120 19:18:33.201074 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.122900 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2df5" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server" containerID="cri-o://1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" gracePeriod=2 Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.550979 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749121 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749432 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.750096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities" (OuterVolumeSpecName: "utilities") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.755244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48" (OuterVolumeSpecName: "kube-api-access-tng48") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "kube-api-access-tng48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.799593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851462 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851501 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851511 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133710 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" exitCode=0 Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133808 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.134249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"d62d5c25c7ac922af447559957ad4ea85cc0573cc46241785ca6c1bbb0f1c947"} Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.134269 4773 scope.go:117] "RemoveContainer" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.161176 4773 scope.go:117] "RemoveContainer" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.173111 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.182572 4773 scope.go:117] "RemoveContainer" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.186511 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.231587 4773 scope.go:117] "RemoveContainer" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.232093 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": container with ID starting with 1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305 not found: ID does not exist" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232132 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} err="failed to get container status \"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": rpc error: code = NotFound desc = could not find container \"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": container with ID starting with 1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305 not found: ID does not exist" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232161 4773 scope.go:117] "RemoveContainer" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.232645 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": container with ID starting with e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e not found: ID does not exist" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232728 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} err="failed to get container status \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": rpc error: code = NotFound desc = could not find container \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": container with ID 
starting with e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e not found: ID does not exist" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232786 4773 scope.go:117] "RemoveContainer" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.233345 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": container with ID starting with a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f not found: ID does not exist" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.233380 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f"} err="failed to get container status \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": rpc error: code = NotFound desc = could not find container \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": container with ID starting with a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f not found: ID does not exist" Jan 20 19:18:37 crc kubenswrapper[4773]: I0120 19:18:37.457812 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" path="/var/lib/kubelet/pods/2412fab9-ca39-492f-abb4-14bc806fe535/volumes" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.170237 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 
19:18:58.170919 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171893 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171987 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" gracePeriod=600 Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.293738 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329351 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" exitCode=0 Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"} Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329442 4773 scope.go:117] "RemoveContainer" containerID="4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.330232 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.330580 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.391137 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-conmon-11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019.scope\": RecentStats: unable to find data in memory cache]" Jan 20 19:19:09 crc 
kubenswrapper[4773]: I0120 19:19:09.447545 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:09 crc kubenswrapper[4773]: E0120 19:19:09.448377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:23 crc kubenswrapper[4773]: I0120 19:19:23.448555 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:23 crc kubenswrapper[4773]: E0120 19:19:23.449910 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:36 crc kubenswrapper[4773]: I0120 19:19:36.447544 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:36 crc kubenswrapper[4773]: E0120 19:19:36.448392 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 
20 19:19:51 crc kubenswrapper[4773]: I0120 19:19:51.447492 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:51 crc kubenswrapper[4773]: E0120 19:19:51.448148 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:54 crc kubenswrapper[4773]: I0120 19:19:54.792843 4773 generic.go:334] "Generic (PLEG): container finished" podID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerID="e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0" exitCode=0 Jan 20 19:19:54 crc kubenswrapper[4773]: I0120 19:19:54.792973 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerDied","Data":"e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0"} Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.188582 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244023 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244250 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc 
kubenswrapper[4773]: I0120 19:19:56.244360 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244438 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244475 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.250456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9" (OuterVolumeSpecName: "kube-api-access-dsmd9") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "kube-api-access-dsmd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.252069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.272268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph" (OuterVolumeSpecName: "ceph") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.275157 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.277039 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.277501 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.279097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory" (OuterVolumeSpecName: "inventory") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.280245 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.283256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.285624 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.294577 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350227 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350583 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350597 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350617 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350628 4773 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350641 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350653 4773 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc 
kubenswrapper[4773]: I0120 19:19:56.350663 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") on node \"crc\" DevicePath \"\""
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350673 4773 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350684 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350699 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerDied","Data":"53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231"}
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808226 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231"
Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808254 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"
Jan 20 19:20:05 crc kubenswrapper[4773]: I0120 19:20:05.447681 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"
Jan 20 19:20:05 crc kubenswrapper[4773]: E0120 19:20:05.448513 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.339634 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341069 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-content"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341091 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-content"
Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341120 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-utilities"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341128 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-utilities"
Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341141 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341149 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server"
Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341165 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341174 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341384 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341414 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.342664 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.345109 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.352335 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.353414 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.355699 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.357846 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.368392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.388818 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436104 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436129 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436152 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436192 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktj2\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436262 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436369 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436414 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436472 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436493 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436569 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436677 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537875 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537944 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537961 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktj2\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538298 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539381 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540112 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540183 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540509 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540694 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540832 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.541005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.541265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.547667 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.548734 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.559981 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.572496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.578655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.581691 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktj2\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.583306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.583311 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0"
Jan 20 19:20:11 crc kubenswrapper[4773]: I0120
19:20:11.585081 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.672821 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.691869 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.071304 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.074434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082357 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vtkh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.087063 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.136452 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.138360 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.146486 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.146805 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.149023 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.157422 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.158564 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162713 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162816 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " 
pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163107 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.215407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267787 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267871 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267907 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268161 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.269279 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.269774 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.272299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.272979 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.275505 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.279511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.280468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.282913 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.284126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.284353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.288538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.289432 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.303080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371328 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371442 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371548 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.372644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.373145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.373153 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.379310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.383455 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.386453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.387261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.387685 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.388534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.389127 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.390021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 
20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.391462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.410815 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.414167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.471242 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.477039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.477204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.478057 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.487834 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.497557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.500496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.512387 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.602732 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.653473 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.967977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"20ce3931ba17bb76a699fb39bbe5e116fea9b64e0577ff1ff4b45da69c5bb50e"} Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.969293 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"64cfe8b2895152a3a7522fd5882e79fbdc6059a51ba538930ce3eb42a4ed6b48"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.174630 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.199958 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.253720 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:13 crc kubenswrapper[4773]: W0120 19:20:13.296128 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80285eae_2998_47ab_bcd6_e9905e2e71d4.slice/crio-adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da WatchSource:0}: Error finding container adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da: Status 404 returned error can't find the container with id adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.985071 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"d055112d14f2e629a8e001df39cd4874e98611d7c257b4025b57d54a9eee8e14"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990744 4773 generic.go:334] "Generic (PLEG): container finished" podID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerID="0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138" exitCode=0 Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerDied","Data":"0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerStarted","Data":"b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.999816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"c695521bd759e3089142a9939da612b16228606f6c17b308e73ac5f6f71b1916"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.003018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"cd2494809157f0dc32b616d02961ab5a824cd251ab6ff4cf380d06eba77cf01b"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.011057 4773 generic.go:334] "Generic (PLEG): container finished" podID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerID="572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9" exitCode=0 Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 
19:20:14.011105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerDied","Data":"572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.011136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerStarted","Data":"adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.104568 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:14 crc kubenswrapper[4773]: W0120 19:20:14.109475 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696e3ee3_25fa_4102_b483_1781d00bb18f.slice/crio-82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251 WatchSource:0}: Error finding container 82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251: Status 404 returned error can't find the container with id 82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251 Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.032858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"19fbddbc9b86f769303d79312c2d030f23e2ed3bedd010b7e3f1f8d4734deb27"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.033662 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"d6094097fb5b6521d0f9eab177f854c1600b0455dcf9c4a951eef9840bbaa33d"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 
19:20:15.038739 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"8c943ae399231c0185d23dd80b5c32282626c1d7c00d8630f9220b5d9186232c"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.047873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"3d61afbbdb37be4e5d0f6699b89507fbdbcb2725381c635c03801cd69764d168"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.047959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.050796 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"7920c95131e3c4f69ab4387125c62c826f431aab0589ad7e26f57c9b92057e19"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.086689 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.08666132 podStartE2EDuration="4.08666132s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:15.059584334 +0000 UTC m=+3007.981397378" watchObservedRunningTime="2026-01-20 19:20:15.08666132 +0000 UTC m=+3008.008474334" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.124584 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.149365459 
podStartE2EDuration="4.124530739s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="2026-01-20 19:20:12.677349746 +0000 UTC m=+3005.599162770" lastFinishedPulling="2026-01-20 19:20:13.652515026 +0000 UTC m=+3006.574328050" observedRunningTime="2026-01-20 19:20:15.079716212 +0000 UTC m=+3008.001529256" watchObservedRunningTime="2026-01-20 19:20:15.124530739 +0000 UTC m=+3008.046343763" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.147485 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.299791759 podStartE2EDuration="4.147456995s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="2026-01-20 19:20:12.512091357 +0000 UTC m=+3005.433904381" lastFinishedPulling="2026-01-20 19:20:13.359756603 +0000 UTC m=+3006.281569617" observedRunningTime="2026-01-20 19:20:15.129556251 +0000 UTC m=+3008.051369275" watchObservedRunningTime="2026-01-20 19:20:15.147456995 +0000 UTC m=+3008.069270019" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.469427 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.476265 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.667979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"80285eae-2998-47ab-bcd6-e9905e2e71d4\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"80285eae-2998-47ab-bcd6-e9905e2e71d4\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" (UID: "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80285eae-2998-47ab-bcd6-e9905e2e71d4" (UID: "80285eae-2998-47ab-bcd6-e9905e2e71d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.670647 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.671137 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.674828 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt" (OuterVolumeSpecName: "kube-api-access-sfltt") pod "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" (UID: "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6"). InnerVolumeSpecName "kube-api-access-sfltt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.685241 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v" (OuterVolumeSpecName: "kube-api-access-lkg9v") pod "80285eae-2998-47ab-bcd6-e9905e2e71d4" (UID: "80285eae-2998-47ab-bcd6-e9905e2e71d4"). InnerVolumeSpecName "kube-api-access-lkg9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.773478 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.773520 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerDied","Data":"b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060026 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060890 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.061864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"f919d7cba009bbb481447339fe707497ba9577a667afaeccba3e1156016cc47c"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.065373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerDied","Data":"adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.065470 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.066350 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.089651 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.089628323 podStartE2EDuration="5.089628323s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:16.085124094 +0000 UTC m=+3009.006937138" watchObservedRunningTime="2026-01-20 19:20:16.089628323 +0000 UTC m=+3009.011441348" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.446886 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:16 crc kubenswrapper[4773]: E0120 19:20:16.447453 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.740829 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.740953 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.625494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:20:17 crc kubenswrapper[4773]: E0120 19:20:17.626434 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" 
containerName="mariadb-database-create" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626457 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerName="mariadb-database-create" Jan 20 19:20:17 crc kubenswrapper[4773]: E0120 19:20:17.626522 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626536 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626841 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626876 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerName="mariadb-database-create" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.627801 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.629788 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qgfd7" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.629838 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.642473 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wx9\" (UniqueName: 
\"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812713 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.821335 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: 
\"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.830423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.836740 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.840572 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.946579 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-r2zvz" Jan 20 19:20:18 crc kubenswrapper[4773]: I0120 19:20:18.525213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:20:19 crc kubenswrapper[4773]: I0120 19:20:19.091790 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerStarted","Data":"0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"} Jan 20 19:20:21 crc kubenswrapper[4773]: I0120 19:20:21.885453 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 20 19:20:21 crc kubenswrapper[4773]: I0120 19:20:21.911569 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.412888 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.413294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.444532 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.462715 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.490158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.491960 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:22 crc 
kubenswrapper[4773]: I0120 19:20:22.527401 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.556050 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.132738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerStarted","Data":"4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241"}
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.133162 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134203 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134233 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134244 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.175869 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-r2zvz" podStartSLOduration=2.283586094 podStartE2EDuration="6.175845676s" podCreationTimestamp="2026-01-20 19:20:17 +0000 UTC" firstStartedPulling="2026-01-20 19:20:18.527919281 +0000 UTC m=+3011.449732305" lastFinishedPulling="2026-01-20 19:20:22.420178863 +0000 UTC m=+3015.341991887" observedRunningTime="2026-01-20 19:20:23.162687707 +0000 UTC m=+3016.084500731" watchObservedRunningTime="2026-01-20 19:20:23.175845676 +0000 UTC m=+3016.097658700"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147157 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147461 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147241 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147502 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.215735 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.220257 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.238841 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.272982 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:29 crc kubenswrapper[4773]: I0120 19:20:29.446865 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"
Jan 20 19:20:29 crc kubenswrapper[4773]: E0120 19:20:29.447710 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.508435 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.511413 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.521138 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.602956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.603341 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.603418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.704864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.704911 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705042 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.732341 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.839128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:33 crc kubenswrapper[4773]: I0120 19:20:33.294416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.224076 4773 generic.go:334] "Generic (PLEG): container finished" podID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerID="4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241" exitCode=0
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.224156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerDied","Data":"4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241"}
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226627 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854" exitCode=0
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226661 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854"}
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"310cc51704b18b2aa90c2ac829fa16af6930efbf07c19a5e76754a3192b7d35b"}
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.237154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c"}
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.637221 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.767920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.768163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.769041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.769315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.782105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data" (OuterVolumeSpecName: "config-data") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.782379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9" (OuterVolumeSpecName: "kube-api-access-95wx9") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "kube-api-access-95wx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.783393 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.806984 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872357 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872390 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872400 4773 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872420 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.247819 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c" exitCode=0
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.247873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c"}
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerDied","Data":"0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"}
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249457 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249427 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614094 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: E0120 19:20:36.614767 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614783 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614999 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.616051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634463 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634674 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634782 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.635626 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qgfd7"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.646572 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.648407 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.654965 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688791 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688900 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.689252 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.707722 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.747236 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.748877 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796037 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796846 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwlkd\" (UniqueName: \"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.813538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.813604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.819984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.820163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.820707 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.821084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.837471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.839790 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.841730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.852009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.854574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.860761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.865376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.882484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900261 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwlkd\" (UniqueName: \"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900521 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.901957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.904116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\")
pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.904127 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.907852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.912068 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.913669 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.917397 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.923537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.931901 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.949898 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.960466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwlkd\" (UniqueName: \"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.998415 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001756 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001856 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001912 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001947 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001965 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.096191 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103392 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103454 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103480 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103512 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103706 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.105871 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.109520 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: 
\"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.112327 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.113658 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.115677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.116547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.125246 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.266359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" 
event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c"} Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.412679 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.649440 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rbwt" podStartSLOduration=3.044691933 podStartE2EDuration="5.649418858s" podCreationTimestamp="2026-01-20 19:20:32 +0000 UTC" firstStartedPulling="2026-01-20 19:20:34.228415268 +0000 UTC m=+3027.150228292" lastFinishedPulling="2026-01-20 19:20:36.833142193 +0000 UTC m=+3029.754955217" observedRunningTime="2026-01-20 19:20:37.307610864 +0000 UTC m=+3030.229423888" watchObservedRunningTime="2026-01-20 19:20:37.649418858 +0000 UTC m=+3030.571231882" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.649775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.677211 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.769629 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"] Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.058410 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.287493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"e3d14db8a2dd3006ff781195948b0be8cc655d85ea68cd28813883eb46dead06"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.294105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"bfadeca77795b07ca8716eb0dc10af286da6021714d107d79cc84f7669939957"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.295552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"e678d9c3a7cacb6b1935cbb52e080eed45938d042984c3bc8838c3bad5e5d7f5"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298125 4773 generic.go:334] "Generic (PLEG): container finished" podID="3b08b301-686b-45e6-9903-5df8a754a16a" containerID="2684a6d31019468fc8fe4bf2534508370b31e404ba23f254ba4e185118fd0f68" exitCode=0 Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerDied","Data":"2684a6d31019468fc8fe4bf2534508370b31e404ba23f254ba4e185118fd0f68"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerStarted","Data":"a09f039bcd20d122d0d482943b9f6fb2ffb12284b2f35c88fb90e38ec1eb6193"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.325515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerStarted","Data":"ef515e1fbbe927a18099f0424c7979916924865c785a079cedb358635604e9ee"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.326169 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.331492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.334725 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.334777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.335754 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.362163 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" podStartSLOduration=3.362142821 podStartE2EDuration="3.362142821s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:39.357845766 +0000 UTC m=+3032.279658810" watchObservedRunningTime="2026-01-20 19:20:39.362142821 +0000 UTC m=+3032.283955845" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.385789 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.385768784 podStartE2EDuration="3.385768784s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:39.378291112 +0000 UTC m=+3032.300104136" 
watchObservedRunningTime="2026-01-20 19:20:39.385768784 +0000 UTC m=+3032.307581808" Jan 20 19:20:40 crc kubenswrapper[4773]: I0120 19:20:40.074415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e"} Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357584 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" containerID="cri-o://e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" gracePeriod=30 Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357828 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" containerID="cri-o://e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" gracePeriod=30 Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.389399 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.580657595 podStartE2EDuration="5.389377885s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="2026-01-20 19:20:37.683296389 +0000 UTC m=+3030.605109413" lastFinishedPulling="2026-01-20 19:20:38.492016679 +0000 UTC m=+3031.413829703" observedRunningTime="2026-01-20 19:20:41.381863042 +0000 UTC m=+3034.303676086" watchObservedRunningTime="2026-01-20 19:20:41.389377885 +0000 UTC m=+3034.311190909" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.128526 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.255889 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.255968 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256099 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256172 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256277 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256669 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.257197 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.257459 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs" (OuterVolumeSpecName: "logs") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.262207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.265064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt" (OuterVolumeSpecName: "kube-api-access-p6ljt") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "kube-api-access-p6ljt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.284087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts" (OuterVolumeSpecName: "scripts") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.289195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.319456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data" (OuterVolumeSpecName: "config-data") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359644 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359690 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359703 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359713 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359727 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359737 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.378230 4773 generic.go:334] "Generic (PLEG): container finished" podID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" exitCode=0 Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.378267 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" exitCode=143 Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379446 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"bfadeca77795b07ca8716eb0dc10af286da6021714d107d79cc84f7669939957"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379667 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.411495 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.429851 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.443227 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.443972 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: E0120 19:20:42.444387 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444407 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: E0120 19:20:42.444423 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444431 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444651 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444672 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.445711 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448448 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448610 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.463849 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.562719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563464 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" 
(UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666481 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666518 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666701 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.667198 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.670793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: 
I0120 19:20:42.672637 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.673021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.674516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.675087 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.678108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.687399 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: 
\"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.691145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.767704 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.842564 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.842683 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.905038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.061843 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062470 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" containerID="cri-o://f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062579 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" containerID="cri-o://6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" 
gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062638 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" containerID="cri-o://52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062162 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" containerID="cri-o://37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393255 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" exitCode=0 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393666 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" exitCode=2 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393712 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6"} Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393801 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0"} Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.447538 4773 scope.go:117] "RemoveContainer" 
containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:43 crc kubenswrapper[4773]: E0120 19:20:43.447854 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.466300 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" path="/var/lib/kubelet/pods/51f938c6-86eb-414f-b3c7-47f8d4c1927d/volumes" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.466974 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:44 crc kubenswrapper[4773]: I0120 19:20:44.430521 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" exitCode=0 Jan 20 19:20:44 crc kubenswrapper[4773]: I0120 19:20:44.430561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80"} Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.099844 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.438794 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rbwt" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" 
containerName="registry-server" containerID="cri-o://b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" gracePeriod=2 Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.785406 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: E0120 19:20:45.786119 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786151 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} err="failed to get container status \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786175 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: E0120 19:20:45.786405 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" 
containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786433 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} err="failed to get container status \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": rpc error: code = NotFound desc = could not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786446 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786725 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} err="failed to get container status \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786773 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.790559 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} err="failed to get container status \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": rpc error: code = NotFound desc = could 
not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.449007 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" exitCode=0 Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.449105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c"} Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.477700 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.952913 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:46.999366 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065133 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.067062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities" (OuterVolumeSpecName: "utilities") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.087071 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz" (OuterVolumeSpecName: "kube-api-access-rnlgz") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "kube-api-access-rnlgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.098359 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.136147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169562 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169604 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169617 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.184946 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.185172 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" containerID="cri-o://0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" gracePeriod=10 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.507509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"310cc51704b18b2aa90c2ac829fa16af6930efbf07c19a5e76754a3192b7d35b"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.509306 4773 scope.go:117] "RemoveContainer" containerID="b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.508190 4773 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.522225 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerID="0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" exitCode=0 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.524339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.550382 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" exitCode=0 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.550459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.563528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"752f309be630a7024a48e3c7c1ac9dee313ddeae32bc1565e6b3ce4581064236"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.563844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"c6753e6bfb6aac80cba0285abf73928e3b7f36843312135f77bb09ae13cf0264"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.564305 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.572072 4773 scope.go:117] "RemoveContainer" containerID="4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.603391 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.617180 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.631269 4773 scope.go:117] "RemoveContainer" containerID="0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698208 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698564 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.702109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.708641 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.720515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts" (OuterVolumeSpecName: "scripts") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.721153 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b" (OuterVolumeSpecName: "kube-api-access-qhs6b") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "kube-api-access-qhs6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.739356 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800508 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800538 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800547 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800555 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800565 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.829360 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.881342 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.903223 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.903254 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.943126 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data" (OuterVolumeSpecName: "config-data") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.005209 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.008029 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.106998 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107183 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107300 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107365 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.133901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd" (OuterVolumeSpecName: "kube-api-access-m92nd") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "kube-api-access-m92nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.181495 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.187839 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config" (OuterVolumeSpecName: "config") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210017 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210047 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210059 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.211622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.215163 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.234185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312493 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312528 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312539 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"69446a8d0d2a42de6e148590cfbcb0a1f5f08dfbfef8edbc94698b1b5257bf49"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576543 4773 scope.go:117] "RemoveContainer" containerID="6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576199 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.578583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"e792910914167205c550decfabf0eb54ec47edaa41002fc235e15f1e3092cfe1"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.578782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.584166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.584234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.599924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.600041 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.609076 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.609050824 podStartE2EDuration="6.609050824s" podCreationTimestamp="2026-01-20 19:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:48.604663158 +0000 UTC m=+3041.526476212" watchObservedRunningTime="2026-01-20 19:20:48.609050824 +0000 UTC m=+3041.530863848" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.633822 4773 scope.go:117] "RemoveContainer" containerID="f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.650941 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.404382468 podStartE2EDuration="12.65090948s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="2026-01-20 19:20:37.659294777 +0000 UTC m=+3030.581107801" lastFinishedPulling="2026-01-20 19:20:45.905821789 +0000 UTC m=+3038.827634813" observedRunningTime="2026-01-20 19:20:48.63980197 +0000 UTC m=+3041.561614994" watchObservedRunningTime="2026-01-20 19:20:48.65090948 +0000 UTC m=+3041.572722504" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.677173 4773 scope.go:117] "RemoveContainer" containerID="52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.705119 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.722466 4773 scope.go:117] "RemoveContainer" containerID="37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 
19:20:48.724227 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.732683 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.742966 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.748674 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749284 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749402 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749461 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="init" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="init" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749565 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-utilities" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749619 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-utilities" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749685 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749740 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749798 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-content" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749867 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-content" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749944 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749998 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.750070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750125 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.750183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750234 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750447 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750529 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750596 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750653 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750707 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750770 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.752531 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.756759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.757882 4773 scope.go:117] "RemoveContainer" containerID="0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.758169 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.761070 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.761663 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.797648 4773 scope.go:117] "RemoveContainer" containerID="f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.827928 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828006 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.931131 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.931424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.936139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 
19:20:48.936516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.936980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.937980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.947945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.952527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.078661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.460727 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020e7117-149f-4a0d-aa81-a324df9db850" path="/var/lib/kubelet/pods/020e7117-149f-4a0d-aa81-a324df9db850/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.462164 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" path="/var/lib/kubelet/pods/b74b1a0f-c97d-4491-94a5-4429140cd990/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.463477 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" path="/var/lib/kubelet/pods/ba4ab073-f712-41fb-9b44-d83a19b72973/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: W0120 19:20:49.567834 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a597841_2c16_4b79_8e39_a24ff2d90b49.slice/crio-6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1 WatchSource:0}: Error finding container 6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1: Status 404 returned error can't find the container with id 6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1 Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.570389 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.610597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1"} Jan 20 19:20:50 crc kubenswrapper[4773]: I0120 19:20:50.073802 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:50 crc kubenswrapper[4773]: I0120 
19:20:50.620137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7"} Jan 20 19:20:51 crc kubenswrapper[4773]: I0120 19:20:51.629589 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa"} Jan 20 19:20:52 crc kubenswrapper[4773]: I0120 19:20:52.640579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2"} Jan 20 19:20:54 crc kubenswrapper[4773]: I0120 19:20:54.447258 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:54 crc kubenswrapper[4773]: E0120 19:20:54.447757 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:56 crc kubenswrapper[4773]: I0120 19:20:56.950828 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.489957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.549357 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-scheduler-0"] Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.695950 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" containerID="cri-o://a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" gracePeriod=30 Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.695815 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" containerID="cri-o://e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" gracePeriod=30 Jan 20 19:20:59 crc kubenswrapper[4773]: I0120 19:20:59.706882 4773 generic.go:334] "Generic (PLEG): container finished" podID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerID="a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" exitCode=0 Jan 20 19:20:59 crc kubenswrapper[4773]: I0120 19:20:59.706963 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e"} Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5"} Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734970 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734483 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" 
containerName="ceilometer-central-agent" containerID="cri-o://44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734611 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" containerID="cri-o://747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734627 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" containerID="cri-o://d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734598 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" containerID="cri-o://19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.769308 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.507948975 podStartE2EDuration="14.769284443s" podCreationTimestamp="2026-01-20 19:20:48 +0000 UTC" firstStartedPulling="2026-01-20 19:20:49.570529041 +0000 UTC m=+3042.492342065" lastFinishedPulling="2026-01-20 19:21:01.831864509 +0000 UTC m=+3054.753677533" observedRunningTime="2026-01-20 19:21:02.762476978 +0000 UTC m=+3055.684290002" watchObservedRunningTime="2026-01-20 19:21:02.769284443 +0000 UTC m=+3055.691097477" Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.753634 4773 generic.go:334] "Generic (PLEG): container finished" podID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" 
containerID="e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.753724 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757107 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757126 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" exitCode=2 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757134 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757141 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757185 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757194 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.994210 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.000237 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.165649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167469 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167955 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168006 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: 
\"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168149 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168183 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168260 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.171245 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.172174 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.172417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts" (OuterVolumeSpecName: "scripts") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.173363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.173642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7" (OuterVolumeSpecName: "kube-api-access-zs9t7") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "kube-api-access-zs9t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.176113 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk" (OuterVolumeSpecName: "kube-api-access-ckqfk") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "kube-api-access-ckqfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.186605 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts" (OuterVolumeSpecName: "scripts") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.202983 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.228076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273063 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273092 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273100 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273110 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273118 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273126 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273134 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273142 4773 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273150 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.281374 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.293791 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.309558 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.331350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data" (OuterVolumeSpecName: "config-data") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.347117 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data" (OuterVolumeSpecName: "config-data") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375135 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375173 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375183 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375190 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 
19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375199 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"e3d14db8a2dd3006ff781195948b0be8cc655d85ea68cd28813883eb46dead06"} Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765623 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765657 4773 scope.go:117] "RemoveContainer" containerID="a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.769517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1"} Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.769600 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.802827 4773 scope.go:117] "RemoveContainer" containerID="e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.816514 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.827759 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.835891 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.845099 4773 scope.go:117] "RemoveContainer" containerID="d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.851286 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.862992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863501 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863516 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863527 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863533 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863553 
4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863559 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863572 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863578 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863598 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863605 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863634 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863833 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863846 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc 
kubenswrapper[4773]: I0120 19:21:04.863862 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863875 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863886 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863893 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.865149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.876907 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.885436 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.889667 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.894615 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901205 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901599 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901775 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.980011 4773 scope.go:117] "RemoveContainer" containerID="19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990535 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990601 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990680 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmk5\" (UniqueName: \"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990716 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990759 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990809 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990869 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: 
\"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.009702 4773 scope.go:117] "RemoveContainer" containerID="747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.031874 4773 scope.go:117] "RemoveContainer" containerID="44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093143 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093240 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmk5\" (UniqueName: 
\"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093458 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093489 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093515 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093568 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc 
kubenswrapper[4773]: I0120 19:21:05.094177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.095402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.095554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.099806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod 
\"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100592 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.101162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.101553 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.102044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.109503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.110328 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.113389 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmk5\" (UniqueName: \"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.114275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.279194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.291211 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.474520 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" path="/var/lib/kubelet/pods/1a597841-2c16-4b79-8e39-a24ff2d90b49/volumes" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.475510 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" path="/var/lib/kubelet/pods/8ac3cbb7-870d-49e0-b7f2-0996320eeea8/volumes" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.734156 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:05 crc kubenswrapper[4773]: W0120 19:21:05.745419 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42acb5b_abbc_4f06_918d_2e886b50146e.slice/crio-c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b WatchSource:0}: Error finding container c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b: Status 404 returned error can't find the container with id c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.747258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:05 crc kubenswrapper[4773]: W0120 19:21:05.753357 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6165543b_5cc4_4a1c_bbba_ed4621838073.slice/crio-ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f WatchSource:0}: Error finding container ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f: Status 404 returned error can't find the container with id ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 
19:21:05.779989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f"} Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.782459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.791922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"8a0e690c09f50115173a74d7527e874019872d7d632b6061b6ba2960a4b3952d"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.792533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"17b2ab04a9549b2f0546229ffc85d19d8af608a7b04abc6aa2309dd798a369fe"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.794541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"083383d1547e587f0b074637cbb9d6d0feba6bb4c700d5fa54a80b4301db7e7d"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.828664 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.8286288490000002 podStartE2EDuration="2.828628849s" podCreationTimestamp="2026-01-20 19:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:21:06.814887175 +0000 UTC m=+3059.736700199" watchObservedRunningTime="2026-01-20 
19:21:06.828628849 +0000 UTC m=+3059.750441873" Jan 20 19:21:07 crc kubenswrapper[4773]: I0120 19:21:07.454973 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:07 crc kubenswrapper[4773]: E0120 19:21:07.456632 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:07 crc kubenswrapper[4773]: I0120 19:21:07.814179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"e4077bc7c096866d5e71987f9c937b260b3efdc756e499d52886e18fefed0f74"} Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.805452 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.830910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"8630a3365c6d2acfb95c5cab91c85213bdc8aab39e00b97c939992e97f5d8c38"} Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.869951 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.870404 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" containerID="cri-o://5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" gracePeriod=30 Jan 
20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.870302 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" containerID="cri-o://14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" gracePeriod=30 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.854983 4773 generic.go:334] "Generic (PLEG): container finished" podID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerID="5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" exitCode=0 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855396 4773 generic.go:334] "Generic (PLEG): container finished" podID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerID="14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" exitCode=1 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2"} Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.105655 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199123 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200582 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201500 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201526 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.204560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph" (OuterVolumeSpecName: "ceph") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.204626 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc" (OuterVolumeSpecName: "kube-api-access-gnctc") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "kube-api-access-gnctc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.205513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.209721 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts" (OuterVolumeSpecName: "scripts") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.264482 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data" (OuterVolumeSpecName: "config-data") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303054 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303133 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303149 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303164 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303175 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.405604 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865145 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865133 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"e678d9c3a7cacb6b1935cbb52e080eed45938d042984c3bc8838c3bad5e5d7f5"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865532 4773 scope.go:117] "RemoveContainer" containerID="5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.867676 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"d6f3c97bb51378b787db3a4d41057fb99d5e4aed95bfb5b21d7e5d45a727e697"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.867836 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.890215 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.736253908 podStartE2EDuration="6.890193439s" podCreationTimestamp="2026-01-20 19:21:04 +0000 UTC" firstStartedPulling="2026-01-20 19:21:05.755599005 +0000 UTC m=+3058.677412029" lastFinishedPulling="2026-01-20 19:21:09.909538536 +0000 UTC m=+3062.831351560" observedRunningTime="2026-01-20 19:21:10.889376668 +0000 UTC m=+3063.811189712" watchObservedRunningTime="2026-01-20 19:21:10.890193439 +0000 UTC m=+3063.812006463" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.893059 4773 scope.go:117] "RemoveContainer" containerID="14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.915165 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 
19:21:10.924149 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.943753 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: E0120 19:21:10.944244 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945076 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: E0120 19:21:10.945120 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945146 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945388 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945428 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.946736 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.952355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.975452 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc 
kubenswrapper[4773]: I0120 19:21:11.014063 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116269 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116290 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116717 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120748 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" 
Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120767 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120944 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.121448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.137484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.270464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.464871 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" path="/var/lib/kubelet/pods/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d/volumes" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.938329 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:11 crc kubenswrapper[4773]: W0120 19:21:11.939241 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7af9581_1520_466a_8b8f_1b957274273e.slice/crio-30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859 WatchSource:0}: Error finding container 30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859: Status 404 returned error can't find the container with id 30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859 Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.885784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"0acff6ef5aff885d7560993655ab871f967d11c930e2fdb7b062c52768a6a2bb"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.886338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"b611c12cc3bce8f76792dd71b28081d98ac51bbfa4e690ca1057e86ed0cce8f4"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.886350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.911330 
4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.911312954 podStartE2EDuration="2.911312954s" podCreationTimestamp="2026-01-20 19:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:21:12.906700952 +0000 UTC m=+3065.828513976" watchObservedRunningTime="2026-01-20 19:21:12.911312954 +0000 UTC m=+3065.833125978" Jan 20 19:21:15 crc kubenswrapper[4773]: I0120 19:21:15.280044 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 20 19:21:21 crc kubenswrapper[4773]: I0120 19:21:21.270849 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 20 19:21:21 crc kubenswrapper[4773]: I0120 19:21:21.447038 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:21 crc kubenswrapper[4773]: E0120 19:21:21.447452 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:26 crc kubenswrapper[4773]: I0120 19:21:26.837706 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 20 19:21:32 crc kubenswrapper[4773]: I0120 19:21:32.793620 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 20 19:21:35 crc kubenswrapper[4773]: I0120 19:21:35.299231 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Jan 20 19:21:36 crc kubenswrapper[4773]: I0120 19:21:36.447206 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:36 crc kubenswrapper[4773]: E0120 19:21:36.447778 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:48 crc kubenswrapper[4773]: I0120 19:21:48.447238 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:48 crc kubenswrapper[4773]: E0120 19:21:48.448098 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:02 crc kubenswrapper[4773]: I0120 19:22:02.447175 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:02 crc kubenswrapper[4773]: E0120 19:22:02.448130 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:13 crc kubenswrapper[4773]: I0120 19:22:13.448296 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:13 crc kubenswrapper[4773]: E0120 19:22:13.449296 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:13 crc kubenswrapper[4773]: E0120 19:22:13.739396 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:33062->38.102.83.39:34695: write tcp 38.102.83.39:33062->38.102.83.39:34695: write: broken pipe Jan 20 19:22:24 crc kubenswrapper[4773]: I0120 19:22:24.447494 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:24 crc kubenswrapper[4773]: E0120 19:22:24.448493 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:38 crc kubenswrapper[4773]: I0120 19:22:38.448823 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:38 crc kubenswrapper[4773]: E0120 19:22:38.449529 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:52 crc kubenswrapper[4773]: I0120 19:22:52.447346 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:52 crc kubenswrapper[4773]: E0120 19:22:52.448128 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:03 crc kubenswrapper[4773]: I0120 19:23:03.449179 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:03 crc kubenswrapper[4773]: E0120 19:23:03.449980 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:16 crc kubenswrapper[4773]: I0120 19:23:16.447480 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:16 crc kubenswrapper[4773]: E0120 19:23:16.448174 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:30 crc kubenswrapper[4773]: I0120 19:23:30.447433 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:30 crc kubenswrapper[4773]: E0120 19:23:30.448640 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:44 crc kubenswrapper[4773]: I0120 19:23:44.447581 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:44 crc kubenswrapper[4773]: E0120 19:23:44.448397 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:56 crc kubenswrapper[4773]: I0120 19:23:56.447005 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:56 crc kubenswrapper[4773]: E0120 
19:23:56.447993 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:24:11 crc kubenswrapper[4773]: I0120 19:24:11.448377 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:24:12 crc kubenswrapper[4773]: I0120 19:24:12.449634 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"} Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.919890 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.929235 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.936078 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034082 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137481 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137769 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.138290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.158951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.267028 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.798429 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"]
Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.937945 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerStarted","Data":"885ff945d26cc2c0aa231b3527368210271a137dbd07e6bd6c660b02a19cd240"}
Jan 20 19:25:08 crc kubenswrapper[4773]: I0120 19:25:08.950461 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53" exitCode=0
Jan 20 19:25:08 crc kubenswrapper[4773]: I0120 19:25:08.950509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"}
Jan 20 19:25:10 crc kubenswrapper[4773]: I0120 19:25:10.984467 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac" exitCode=0
Jan 20 19:25:10 crc kubenswrapper[4773]: I0120 19:25:10.984530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"}
Jan 20 19:25:13 crc kubenswrapper[4773]: I0120 19:25:13.003009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerStarted","Data":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"}
Jan 20 19:25:13 crc kubenswrapper[4773]: I0120 19:25:13.026536 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rxnc7" podStartSLOduration=4.168866828 podStartE2EDuration="7.026514266s" podCreationTimestamp="2026-01-20 19:25:06 +0000 UTC" firstStartedPulling="2026-01-20 19:25:08.953979269 +0000 UTC m=+3301.875792293" lastFinishedPulling="2026-01-20 19:25:11.811626697 +0000 UTC m=+3304.733439731" observedRunningTime="2026-01-20 19:25:13.020797386 +0000 UTC m=+3305.942610410" watchObservedRunningTime="2026-01-20 19:25:13.026514266 +0000 UTC m=+3305.948327290"
Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.269196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.269816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.320479 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:18 crc kubenswrapper[4773]: I0120 19:25:18.089318 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:18 crc kubenswrapper[4773]: I0120 19:25:18.140245 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"]
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.060018 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rxnc7" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server" containerID="cri-o://c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" gracePeriod=2
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.494415 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.614917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") "
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") "
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") "
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615777 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities" (OuterVolumeSpecName: "utilities") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.627085 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4" (OuterVolumeSpecName: "kube-api-access-6dlp4") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "kube-api-access-6dlp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.718068 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.718105 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") on node \"crc\" DevicePath \"\""
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.726709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.820124 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080500 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" exitCode=0
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080820 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"}
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080851 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"885ff945d26cc2c0aa231b3527368210271a137dbd07e6bd6c660b02a19cd240"}
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080872 4773 scope.go:117] "RemoveContainer" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.081059 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.113223 4773 scope.go:117] "RemoveContainer" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.123081 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"]
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.130652 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"]
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.154045 4773 scope.go:117] "RemoveContainer" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.189492 4773 scope.go:117] "RemoveContainer" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"
Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.189911 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": container with ID starting with c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747 not found: ID does not exist" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190182 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"} err="failed to get container status \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": rpc error: code = NotFound desc = could not find container \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": container with ID starting with c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747 not found: ID does not exist"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190213 4773 scope.go:117] "RemoveContainer" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"
Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.190465 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": container with ID starting with b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac not found: ID does not exist" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190488 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"} err="failed to get container status \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": rpc error: code = NotFound desc = could not find container \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": container with ID starting with b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac not found: ID does not exist"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190502 4773 scope.go:117] "RemoveContainer" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"
Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.191461 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": container with ID starting with 8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53 not found: ID does not exist" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.191525 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"} err="failed to get container status \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": rpc error: code = NotFound desc = could not find container \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": container with ID starting with 8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53 not found: ID does not exist"
Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.462630 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" path="/var/lib/kubelet/pods/840e8c66-c1a1-423a-ab61-21697ce5f35d/volumes"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.782260 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783301 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-content"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-content"
Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-utilities"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783355 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-utilities"
Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783383 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783656 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.785238 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.792877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048333 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.069288 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.116525 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.599354 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577153 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763" exitCode=0
Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577243 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"}
Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577463 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerStarted","Data":"a6a22ad3e980bdfc6cc2b08d94bf77888d412aa3bd57fac6d02fb9d83ab48b57"}
Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.578978 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 20 19:26:22 crc kubenswrapper[4773]: I0120 19:26:22.597538 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23" exitCode=0
Jan 20 19:26:22 crc kubenswrapper[4773]: I0120 19:26:22.597766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"}
Jan 20 19:26:23 crc kubenswrapper[4773]: I0120 19:26:23.611313 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerStarted","Data":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"}
Jan 20 19:26:23 crc kubenswrapper[4773]: I0120 19:26:23.632861 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wp4jg" podStartSLOduration=3.228896666 podStartE2EDuration="5.632842673s" podCreationTimestamp="2026-01-20 19:26:18 +0000 UTC" firstStartedPulling="2026-01-20 19:26:20.57871681 +0000 UTC m=+3373.500529844" lastFinishedPulling="2026-01-20 19:26:22.982662827 +0000 UTC m=+3375.904475851" observedRunningTime="2026-01-20 19:26:23.626760474 +0000 UTC m=+3376.548573498" watchObservedRunningTime="2026-01-20 19:26:23.632842673 +0000 UTC m=+3376.554655697"
Jan 20 19:26:28 crc kubenswrapper[4773]: I0120 19:26:28.169706 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:26:28 crc kubenswrapper[4773]: I0120 19:26:28.170279 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.116673 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.117030 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.176555 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.712043 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.758445 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:31 crc kubenswrapper[4773]: I0120 19:26:31.685201 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wp4jg" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server" containerID="cri-o://c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" gracePeriod=2
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.176568 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") "
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") "
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") "
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.296162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities" (OuterVolumeSpecName: "utilities") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.300200 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq" (OuterVolumeSpecName: "kube-api-access-mtvsq") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "kube-api-access-mtvsq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.323446 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397780 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397822 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") on node \"crc\" DevicePath \"\""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397836 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696790 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" exitCode=0
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696882 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696887 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"}
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.698023 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"a6a22ad3e980bdfc6cc2b08d94bf77888d412aa3bd57fac6d02fb9d83ab48b57"}
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.698070 4773 scope.go:117] "RemoveContainer" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.738236 4773 scope.go:117] "RemoveContainer" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.768903 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.777817 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"]
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.784224 4773 scope.go:117] "RemoveContainer" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817048 4773 scope.go:117] "RemoveContainer" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"
Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.817543 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": container with ID starting with c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470 not found: ID does not exist" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817577 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"} err="failed to get container status \"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": rpc error: code = NotFound desc = could not find container \"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": container with ID starting with c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470 not found: ID does not exist"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817604 4773 scope.go:117] "RemoveContainer" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"
Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.818077 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": container with ID starting with c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23 not found: ID does not exist" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.818127 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"} err="failed to get container status \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": rpc error: code = NotFound desc = could not find container \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": container with ID starting with c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23 not found: ID does not exist"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.818159 4773 scope.go:117] "RemoveContainer" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"
Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.819212 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": container with ID starting with 4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763 not found: ID does not exist" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"
Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.819268 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"} err="failed to get container status \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": rpc error: code = NotFound desc = could not find container \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": container with ID starting with 4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763 not found: ID does not exist"
Jan 20 19:26:33 crc kubenswrapper[4773]: I0120 19:26:33.459904 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" path="/var/lib/kubelet/pods/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2/volumes"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.545383 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"]
Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546278 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-utilities"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546291 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-utilities"
Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546301 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546307 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server"
Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546338 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-content"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546345 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-content"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.547588 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.550241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mzdmf"/"openshift-service-ca.crt"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.550465 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mzdmf"/"kube-root-ca.crt"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.557585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.557686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.567499 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"]
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.658957 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.659162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.659583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.688766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.867334 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:26:49 crc kubenswrapper[4773]: I0120 19:26:49.312654 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"]
Jan 20 19:26:49 crc kubenswrapper[4773]: W0120 19:26:49.321912 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae725e5b_de4d_443b_bd8c_985abdcb0f87.slice/crio-f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9 WatchSource:0}: Error finding container f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9: Status 404 returned error can't find the container with id f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9
Jan 20 19:26:49 crc kubenswrapper[4773]: I0120 19:26:49.843875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9"}
Jan 20 19:26:56 crc kubenswrapper[4773]: I0120 19:26:56.926393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"}
Jan 20 19:26:57 crc kubenswrapper[4773]: I0120 19:26:57.939834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"}
Jan 20 19:26:57 crc kubenswrapper[4773]: I0120 19:26:57.957474 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mzdmf/must-gather-lp22t" podStartSLOduration=2.642644924
podStartE2EDuration="9.95745883s" podCreationTimestamp="2026-01-20 19:26:48 +0000 UTC" firstStartedPulling="2026-01-20 19:26:49.325011889 +0000 UTC m=+3402.246824913" lastFinishedPulling="2026-01-20 19:26:56.639825795 +0000 UTC m=+3409.561638819" observedRunningTime="2026-01-20 19:26:57.956381733 +0000 UTC m=+3410.878194757" watchObservedRunningTime="2026-01-20 19:26:57.95745883 +0000 UTC m=+3410.879271854" Jan 20 19:26:58 crc kubenswrapper[4773]: I0120 19:26:58.170248 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:26:58 crc kubenswrapper[4773]: I0120 19:26:58.170296 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.890463 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"] Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.892180 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.894447 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mzdmf"/"default-dockercfg-6z6zw" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.943119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.943187 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.045928 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.046016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.046073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.065583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.213666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: W0120 19:27:01.251753 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630b0da4_d7f7_4f6e_8489_66087b5b8974.slice/crio-f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407 WatchSource:0}: Error finding container f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407: Status 404 returned error can't find the container with id f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407 Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.975246 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerStarted","Data":"f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407"} Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.316616 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7574cb8f94-wwkgd_436dcd32-51a0-4a9e-8a0a-fb852a5de1f0/barbican-api-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.328050 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7574cb8f94-wwkgd_436dcd32-51a0-4a9e-8a0a-fb852a5de1f0/barbican-api/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.352247 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7854d7cd94-r9cm7_8839acb4-5db9-4b47-a075-8798d8a01c6b/barbican-keystone-listener-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.358305 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7854d7cd94-r9cm7_8839acb4-5db9-4b47-a075-8798d8a01c6b/barbican-keystone-listener/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.378476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69f4d99ff7-gmlhl_52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782/barbican-worker-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.387783 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69f4d99ff7-gmlhl_52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782/barbican-worker/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.457401 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8_586f1b07-ae25-4acf-8a65-92377c4db234/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.486455 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/ceilometer-central-agent/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.508500 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/ceilometer-notification-agent/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.516924 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/sg-core/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.523934 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/proxy-httpd/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.541364 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx_2cefaa80-8ba4-4e73-81e3-927c47cc2a5d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.565330 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7_e1492d77-23f5-4ed0-9511-e5b4ee1107c7/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.590973 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d4d69bee-fde2-4fb6-95f6-74e35b8d5db5/cinder-api-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.633894 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d4d69bee-fde2-4fb6-95f6-74e35b8d5db5/cinder-api/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.804339 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4a053127-e129-429c-9a7b-28e084c34269/cinder-backup/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.814584 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4a053127-e129-429c-9a7b-28e084c34269/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.854972 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3e6b840-22c8-4add-b022-1ba197ca588c/cinder-scheduler/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.881881 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3e6b840-22c8-4add-b022-1ba197ca588c/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.947452 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_466685e0-d49e-4d97-9436-7db7c10062c3/cinder-volume/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.962018 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_466685e0-d49e-4d97-9436-7db7c10062c3/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.987727 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm_a290d892-d26b-4f1c-b4a0-9778e6b58c7b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.055035 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5_b3ce8585-331b-44ef-b8f8-aa5cb3b96589/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.074911 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-d8qrf_3b08b301-686b-45e6-9903-5df8a754a16a/dnsmasq-dns/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.080705 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-d8qrf_3b08b301-686b-45e6-9903-5df8a754a16a/init/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.097350 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_696e3ee3-25fa-4102-b483-1781d00bb18f/glance-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.112831 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_696e3ee3-25fa-4102-b483-1781d00bb18f/glance-httpd/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.133684 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_117c4f3b-d438-4f73-966c-378c28f67460/glance-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.145898 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_117c4f3b-d438-4f73-966c-378c28f67460/glance-httpd/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.450929 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fb89f56b-287lx_cd9ba14c-8dca-4170-841c-6f5d5fa2b220/horizon-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.530955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fb89f56b-287lx_cd9ba14c-8dca-4170-841c-6f5d5fa2b220/horizon/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.549876 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lshfv_a459169f-671f-4dd7-96d3-019d59bd14c6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.573043 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-h7lxl_00dc0471-09f0-4cdf-a237-aba1d232cf04/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.695163 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bdd8cdbd7-xhf92_03658323-86f4-42ec-b18f-163a1e7dcaed/keystone-api/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.703679 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29482261-dqww9_5b0951c0-055b-44bd-a686-9a4938af6b4f/keystone-cron/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.715851 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea/kube-state-metrics/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.756457 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9_5e1b8272-3f37-405c-9f7c-acc1dd855d60/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.770573 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6e180830-62c9-4473-9d6b-197fbe92af49/manila-api-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.847808 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6e180830-62c9-4473-9d6b-197fbe92af49/manila-api/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.862000 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-c4ba-account-create-update-47gmh_80285eae-2998-47ab-bcd6-e9905e2e71d4/mariadb-account-create-update/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.874840 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-5vk9g_33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6/mariadb-database-create/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.900040 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-r2zvz_32b245ce-84e1-4fbc-adef-ebfdd1e88d77/manila-db-sync/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.006147 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e42acb5b-abbc-4f06-918d-2e886b50146e/manila-scheduler/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 
19:27:05.011389 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e42acb5b-abbc-4f06-918d-2e886b50146e/probe/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.052365 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a7af9581-1520-466a-8b8f-1b957274273e/manila-share/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.072923 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a7af9581-1520-466a-8b8f-1b957274273e/probe/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.211619 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerStarted","Data":"850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82"} Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.233725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" podStartSLOduration=1.775508852 podStartE2EDuration="16.233701505s" podCreationTimestamp="2026-01-20 19:27:00 +0000 UTC" firstStartedPulling="2026-01-20 19:27:01.254016958 +0000 UTC m=+3414.175829982" lastFinishedPulling="2026-01-20 19:27:15.712209611 +0000 UTC m=+3428.634022635" observedRunningTime="2026-01-20 19:27:16.227167155 +0000 UTC m=+3429.148980179" watchObservedRunningTime="2026-01-20 19:27:16.233701505 +0000 UTC m=+3429.155514529" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.511673 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb8cda87-65c5-4be7-9891-b82bcfc8e0d4/memcached/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.555639 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cffc5d9-m6wn7_f98c94f3-5e79-4d1a-9e1f-bab68689f193/neutron-api/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: 
I0120 19:27:16.571101 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cffc5d9-m6wn7_f98c94f3-5e79-4d1a-9e1f-bab68689f193/neutron-httpd/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.072651 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc_f735fea9-67a7-4dcc-96f9-8e852df016ce/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.160705 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f890481e-0c9f-4194-8af3-d808bb105995/nova-api-log/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.375087 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f890481e-0c9f-4194-8af3-d808bb105995/nova-api-api/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.474049 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b123d99d-6cf6-4516-a5ae-7dcdf8262269/nova-cell0-conductor-conductor/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.586810 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7970e552-0aac-436b-ba20-4810e82dcd20/nova-cell1-conductor-conductor/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.662979 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7c5b56a3-1c91-4347-ae44-63f05c35e134/nova-cell1-novncproxy-novncproxy/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.714617 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d_e7bfe1d6-9e6c-4964-9cdf-2204156f14c6/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.782188 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_ceeec9e1-d0f5-497c-b262-2ef81be261ee/nova-metadata-log/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.511887 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ceeec9e1-d0f5-497c-b262-2ef81be261ee/nova-metadata-metadata/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.602897 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_8413ef33-749f-4413-9965-fd19ad70ebfc/nova-scheduler-scheduler/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.629901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bfe9133c-0d58-4877-97ee-5b0abeee1a95/galera/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.642450 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bfe9133c-0d58-4877-97ee-5b0abeee1a95/mysql-bootstrap/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.685663 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_11b243ca-6da3-4247-a1fe-2ea3e5be80cc/galera/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.705552 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_11b243ca-6da3-4247-a1fe-2ea3e5be80cc/mysql-bootstrap/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.716546 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f040c75f-a2cb-4bfe-9fd1-0105887fa6b4/openstackclient/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.735363 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xs9zd_a5ceb1c5-1dbc-4810-95c9-c1ac0b915542/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.756642 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovsdb-server/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.768303 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovs-vswitchd/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.775228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovsdb-server-init/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.786943 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-t5h8j_2fce4eb9-f614-4050-a099-0a743695dcd9/ovn-controller/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.820533 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-cjrtn_de805082-3188-4adb-9607-4ec5535de661/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.830903 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_152ecb39-d580-4c8d-b572-e3a6bb070c7f/ovn-northd/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.838905 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_152ecb39-d580-4c8d-b572-e3a6bb070c7f/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.853383 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5818e5c4-9a2c-453f-b158-f4be5ec40619/ovsdbserver-nb/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.862605 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5818e5c4-9a2c-453f-b158-f4be5ec40619/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.888968 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4c900f03-61d3-470c-9803-3f6b617ddf0a/ovsdbserver-sb/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.896044 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4c900f03-61d3-470c-9803-3f6b617ddf0a/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.945968 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-668885694d-2br7g_a7bad355-1a37-4372-9751-25a39f6a3410/placement-log/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.977078 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-668885694d-2br7g_a7bad355-1a37-4372-9751-25a39f6a3410/placement-api/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.997965 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_35926f65-848d-4db5-b50a-deef510ce4be/rabbitmq/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.003916 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_35926f65-848d-4db5-b50a-deef510ce4be/setup-container/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.033688 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_375735e1-5d2a-4cc8-892b-4bdcdf9f1e42/rabbitmq/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.038665 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_375735e1-5d2a-4cc8-892b-4bdcdf9f1e42/setup-container/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.057967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst_f6ebb133-0720-46a2-9da3-ec9dc396266b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.074412 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f_2ce9c199-1f55-4aea-82f7-5df21339c927/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.090169 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ppqxn_617e6a58-e676-42e3-a897-939d9072d030/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.104511 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ll277_08ee5bdf-bc91-4f34-8459-bc65419f93d7/ssh-known-hosts-edpm-deployment/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.118200 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl_9114481d-74c0-4af1-9bed-3f592f2c102f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.870629 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.878016 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.900358 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.596568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.604375 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.611020 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.620555 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.626703 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.635993 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.643385 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.651390 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.666434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.692134 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log"
Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.701193 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log"
Jan 20 19:27:27 crc kubenswrapper[4773]: I0120 19:27:27.042876 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log"
Jan 20 19:27:27 crc kubenswrapper[4773]: I0120 19:27:27.048813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log"
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.170666 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.171115 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.171187 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.172277 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.172387 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" gracePeriod=600
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305735 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" exitCode=0
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305780 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"}
Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305819 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.316393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"}
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.655845 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.673116 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.682597 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.856814 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.909836 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log"
Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.921070 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.171367 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.196025 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.321986 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.809560 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.825255 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.898464 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log"
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.950042 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.021654 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.099888 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.192277 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.210008 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.232397 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log"
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.336206 4773 generic.go:334] "Generic (PLEG): container finished" podID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerID="850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82" exitCode=0
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.336260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerDied","Data":"850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82"}
Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.397756 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.451381 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.500065 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"]
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.519234 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"]
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"630b0da4-d7f7-4f6e-8489-66087b5b8974\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") "
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577360 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"630b0da4-d7f7-4f6e-8489-66087b5b8974\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") "
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577574 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host" (OuterVolumeSpecName: "host") pod "630b0da4-d7f7-4f6e-8489-66087b5b8974" (UID: "630b0da4-d7f7-4f6e-8489-66087b5b8974"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.578250 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") on node \"crc\" DevicePath \"\""
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.588216 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7" (OuterVolumeSpecName: "kube-api-access-j5cm7") pod "630b0da4-d7f7-4f6e-8489-66087b5b8974" (UID: "630b0da4-d7f7-4f6e-8489-66087b5b8974"). InnerVolumeSpecName "kube-api-access-j5cm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.680036 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") on node \"crc\" DevicePath \"\""
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.818387 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.827228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.887419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.922346 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.942991 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log"
Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.958984 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.049321 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.058022 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.068858 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.353460 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.353541 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.460742 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" path="/var/lib/kubelet/pods/630b0da4-d7f7-4f6e-8489-66087b5b8974/volumes"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.675577 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"]
Jan 20 19:27:33 crc kubenswrapper[4773]: E0120 19:27:33.676240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.676343 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.676593 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.677431 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.679696 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mzdmf"/"default-dockercfg-6z6zw"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.799254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.799624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.901974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.902101 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.902291 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.926515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.992945 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:34 crc kubenswrapper[4773]: W0120 19:27:34.039352 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec34924_f3b4_407a_ae92_3e002b13c954.slice/crio-182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663 WatchSource:0}: Error finding container 182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663: Status 404 returned error can't find the container with id 182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663
Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.371646 4773 generic.go:334] "Generic (PLEG): container finished" podID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerID="9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05" exitCode=1
Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.371737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" event={"ID":"5ec34924-f3b4-407a-ae92-3e002b13c954","Type":"ContainerDied","Data":"9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05"}
Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.372099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" event={"ID":"5ec34924-f3b4-407a-ae92-3e002b13c954","Type":"ContainerStarted","Data":"182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663"}
Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.412737 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"]
Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.427617 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"]
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.485983 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"5ec34924-f3b4-407a-ae92-3e002b13c954\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") "
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host" (OuterVolumeSpecName: "host") pod "5ec34924-f3b4-407a-ae92-3e002b13c954" (UID: "5ec34924-f3b4-407a-ae92-3e002b13c954"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631305 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"5ec34924-f3b4-407a-ae92-3e002b13c954\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") "
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631851 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") on node \"crc\" DevicePath \"\""
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.643271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb" (OuterVolumeSpecName: "kube-api-access-4khzb") pod "5ec34924-f3b4-407a-ae92-3e002b13c954" (UID: "5ec34924-f3b4-407a-ae92-3e002b13c954"). InnerVolumeSpecName "kube-api-access-4khzb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.754767 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") on node \"crc\" DevicePath \"\""
Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.397052 4773 scope.go:117] "RemoveContainer" containerID="9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05"
Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.397501 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9"
Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.708447 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-857hw_1e5ac136-d46c-45e3-9a5f-548ac22fac5c/control-plane-machine-set-operator/0.log"
Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.726379 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/kube-rbac-proxy/0.log"
Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.736508 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/machine-api-operator/0.log"
Jan 20 19:27:37 crc kubenswrapper[4773]: I0120 19:27:37.458308 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" path="/var/lib/kubelet/pods/5ec34924-f3b4-407a-ae92-3e002b13c954/volumes"
Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.856197 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log"
Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.872698 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log"
Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.887836 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.758286 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z2s7z_431c5397-9244-4083-9659-59210fd6d5c0/nmstate-console-plugin/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.771283 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q9h42_ef435627-8918-4451-8d3a-23e494e29f56/nmstate-handler/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.795647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/nmstate-metrics/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.804766 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/kube-rbac-proxy/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.817320 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-bwjbw_a0e928f6-ac84-4903-ab0e-08557dea077f/nmstate-operator/0.log"
Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.827430 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-8q4jc_9380b21a-b971-4bb9-9572-d795f171b941/nmstate-webhook/0.log"
Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.164641 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log"
Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.172151 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log"
Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.190796 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.558342 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.570281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.574323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.581166 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.587505 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.592860 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.599655 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.607162 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.618232 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.642227 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.653002 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.953827 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log"
Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.961283 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.326884 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/extract/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.333497 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/util/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.342509 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/pull/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.354815 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/extract/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.363168 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/util/0.log"
Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.375152 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/pull/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.155481 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/registry-server/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.162279 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/extract-utilities/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.170112 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/extract-content/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.629106 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/registry-server/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.634415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/extract-utilities/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.641201 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/extract-content/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.661744 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kcc74_785e6f78-9a81-429e-8cad-f60275661e58/marketplace-operator/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.835679 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/registry-server/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.840559 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/extract-utilities/0.log"
Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.847548 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/extract-content/0.log"
Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.400431 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/registry-server/0.log"
Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.405511 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/extract-utilities/0.log"
Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.413572 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/extract-content/0.log"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.913873 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2c6jv"]
Jan 20 19:29:24 crc kubenswrapper[4773]: E0120 19:29:24.914732 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.914745 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.914969 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.916220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.943577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"]
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973703 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv"
Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973992 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv"
Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076287
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.077016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.077055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.105658 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.238626 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.857141 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.280423 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" exitCode=0 Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.280523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e"} Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.282571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerStarted","Data":"6678b7da17cb66288f2199f90a24c29f56c670b8a88a035f6107e08d3e5e1e1e"} Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.170456 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.171033 4773 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.297902 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" exitCode=0 Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.297972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056"} Jan 20 19:29:29 crc kubenswrapper[4773]: I0120 19:29:29.308339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerStarted","Data":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} Jan 20 19:29:29 crc kubenswrapper[4773]: I0120 19:29:29.327380 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2c6jv" podStartSLOduration=2.581570052 podStartE2EDuration="5.327355824s" podCreationTimestamp="2026-01-20 19:29:24 +0000 UTC" firstStartedPulling="2026-01-20 19:29:26.282368346 +0000 UTC m=+3559.204181370" lastFinishedPulling="2026-01-20 19:29:29.028154118 +0000 UTC m=+3561.949967142" observedRunningTime="2026-01-20 19:29:29.325202482 +0000 UTC m=+3562.247015526" watchObservedRunningTime="2026-01-20 19:29:29.327355824 +0000 UTC m=+3562.249168848" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.239605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.240206 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.294955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.409102 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.542077 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:37 crc kubenswrapper[4773]: I0120 19:29:37.379015 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2c6jv" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" containerID="cri-o://5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" gracePeriod=2 Jan 20 19:29:37 crc kubenswrapper[4773]: I0120 19:29:37.893770 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040227 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040491 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.044565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities" (OuterVolumeSpecName: "utilities") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.050217 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk" (OuterVolumeSpecName: "kube-api-access-tn5dk") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "kube-api-access-tn5dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.111361 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142635 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142673 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142687 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.388984 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" exitCode=0 Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389030 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389056 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"6678b7da17cb66288f2199f90a24c29f56c670b8a88a035f6107e08d3e5e1e1e"} Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389063 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389076 4773 scope.go:117] "RemoveContainer" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.407797 4773 scope.go:117] "RemoveContainer" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.446821 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.448887 4773 scope.go:117] "RemoveContainer" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.463503 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.502351 4773 scope.go:117] "RemoveContainer" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: E0120 19:29:38.502816 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": container with ID starting with 5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720 not found: ID does not exist" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 
19:29:38.502858 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} err="failed to get container status \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": rpc error: code = NotFound desc = could not find container \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": container with ID starting with 5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720 not found: ID does not exist" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.502885 4773 scope.go:117] "RemoveContainer" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: E0120 19:29:38.503418 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": container with ID starting with 2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056 not found: ID does not exist" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503463 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056"} err="failed to get container status \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": rpc error: code = NotFound desc = could not find container \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": container with ID starting with 2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056 not found: ID does not exist" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503504 4773 scope.go:117] "RemoveContainer" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc 
kubenswrapper[4773]: E0120 19:29:38.503902 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": container with ID starting with c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e not found: ID does not exist" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503969 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e"} err="failed to get container status \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": rpc error: code = NotFound desc = could not find container \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": container with ID starting with c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e not found: ID does not exist" Jan 20 19:29:39 crc kubenswrapper[4773]: I0120 19:29:39.458245 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" path="/var/lib/kubelet/pods/520f286f-ac9f-40aa-939b-2a4cd53ebbd0/volumes" Jan 20 19:29:50 crc kubenswrapper[4773]: I0120 19:29:50.898967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log" Jan 20 19:29:50 crc kubenswrapper[4773]: I0120 19:29:50.931841 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.022526 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: 
I0120 19:29:51.233388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.252729 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.271994 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.332366 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.341582 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.346675 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.354826 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.361285 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.373889 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log" Jan 20 19:29:52 crc 
kubenswrapper[4773]: I0120 19:29:52.382406 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.390718 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.392144 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.398672 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.404765 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.405268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.460968 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.472278 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.513426 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.609685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.626206 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.722833 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.753123 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.790883 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.938901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.948419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.155794 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.165033 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.247049 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.303748 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.343813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.402815 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.532972 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.546614 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.562871 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.756568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.107224 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.123981 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.137374 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.852282 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-857hw_1e5ac136-d46c-45e3-9a5f-548ac22fac5c/control-plane-machine-set-operator/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.868060 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/kube-rbac-proxy/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.876585 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/machine-api-operator/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.067508 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.078808 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.132653 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.155542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.175234 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.183908 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.266897 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.275962 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.286137 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.616341 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.621436 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.630073 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.711506 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.765631 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.778443 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.827703 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log" Jan 20 19:29:55 crc 
kubenswrapper[4773]: I0120 19:29:55.837443 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.864476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.237695 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.251647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.332728 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.374989 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.406685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.451708 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log" Jan 20 
19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.526265 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.538033 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.554751 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.678440 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z2s7z_431c5397-9244-4083-9659-59210fd6d5c0/nmstate-console-plugin/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.696421 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q9h42_ef435627-8918-4451-8d3a-23e494e29f56/nmstate-handler/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.714708 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/nmstate-metrics/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.723275 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.723663 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/kube-rbac-proxy/0.log" Jan 20 19:29:56 crc 
kubenswrapper[4773]: I0120 19:29:56.737258 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-bwjbw_a0e928f6-ac84-4903-ab0e-08557dea077f/nmstate-operator/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.747363 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-8q4jc_9380b21a-b971-4bb9-9572-d795f171b941/nmstate-webhook/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.148323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.162829 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.174682 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.174979 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.251687 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 
19:29:58.288509 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.317571 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.333312 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.434550 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.450167 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.462310 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.027823 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/kube-multus-additional-cni-plugins/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.035910 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/egress-router-binary-copy/0.log" Jan 20 19:30:00 crc 
kubenswrapper[4773]: I0120 19:30:00.043259 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/cni-plugins/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.049483 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/bond-cni-plugin/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.056640 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/routeoverride-cni/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.061761 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/whereabouts-cni-bincopy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.068561 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/whereabouts-cni/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.101081 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-x6fwb_deccf4fe-9230-4e96-b16c-a2ed0d2235a7/multus-admission-controller/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.113869 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-x6fwb_deccf4fe-9230-4e96-b16c-a2ed0d2235a7/kube-rbac-proxy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145219 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145697 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145721 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145739 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-utilities" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145746 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-utilities" Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145759 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-content" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145765 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-content" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145998 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.146758 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.148922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.149734 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.156564 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.184715 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.269810 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/3.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280527 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.281299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.286271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.294900 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jpbd_3791c4b7-dcef-470d-a67e-a2c0bb004436/network-metrics-daemon/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.300673 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jpbd_3791c4b7-dcef-470d-a67e-a2c0bb004436/kube-rbac-proxy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.302604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.470226 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.891204 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.611730 4773 generic.go:334] "Generic (PLEG): container finished" podID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerID="42fb582633ba9007174eeae5e881990f39239a506ecbe079eebae42539c1b6a9" exitCode=0 Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.612003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerDied","Data":"42fb582633ba9007174eeae5e881990f39239a506ecbe079eebae42539c1b6a9"} Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.612063 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerStarted","Data":"3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839"} Jan 20 19:30:02 crc kubenswrapper[4773]: I0120 19:30:02.916229 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037176 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037406 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.038168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.043668 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.044384 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg" (OuterVolumeSpecName: "kube-api-access-nctvg") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "kube-api-access-nctvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139814 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139844 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139854 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.630865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerDied","Data":"3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839"} Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.631275 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.630914 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.995720 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 19:30:04 crc kubenswrapper[4773]: I0120 19:30:04.004303 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 19:30:05 crc kubenswrapper[4773]: I0120 19:30:05.462776 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" path="/var/lib/kubelet/pods/a10b40f1-a7af-4ef6-ac5d-104e09a494d9/volumes" Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.034178 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.052270 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.064557 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.074643 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:30:17 crc kubenswrapper[4773]: I0120 19:30:17.466528 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" path="/var/lib/kubelet/pods/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6/volumes" Jan 20 19:30:17 crc kubenswrapper[4773]: I0120 19:30:17.467773 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" path="/var/lib/kubelet/pods/80285eae-2998-47ab-bcd6-e9905e2e71d4/volumes" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.009426 4773 scope.go:117] 
"RemoveContainer" containerID="572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.039338 4773 scope.go:117] "RemoveContainer" containerID="0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.094112 4773 scope.go:117] "RemoveContainer" containerID="5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171226 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171881 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.172880 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.173044 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" gracePeriod=600 Jan 20 19:30:28 crc kubenswrapper[4773]: E0120 19:30:28.296613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.844889 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" exitCode=0 Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.844980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"} Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.845052 4773 scope.go:117] "RemoveContainer" containerID="f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.845757 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:28 crc kubenswrapper[4773]: E0120 19:30:28.846096 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:36 crc kubenswrapper[4773]: I0120 19:30:36.060148 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:30:36 crc kubenswrapper[4773]: I0120 19:30:36.069997 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:30:37 crc kubenswrapper[4773]: I0120 19:30:37.479682 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" path="/var/lib/kubelet/pods/32b245ce-84e1-4fbc-adef-ebfdd1e88d77/volumes" Jan 20 19:30:42 crc kubenswrapper[4773]: I0120 19:30:42.448125 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:42 crc kubenswrapper[4773]: E0120 19:30:42.448972 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:54 crc kubenswrapper[4773]: I0120 19:30:54.447877 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:54 crc kubenswrapper[4773]: E0120 19:30:54.449214 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:07 crc kubenswrapper[4773]: I0120 19:31:07.452460 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:07 crc kubenswrapper[4773]: E0120 19:31:07.453241 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:21 crc kubenswrapper[4773]: I0120 19:31:21.223731 4773 scope.go:117] "RemoveContainer" containerID="4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241" Jan 20 19:31:21 crc kubenswrapper[4773]: I0120 19:31:21.447165 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:21 crc kubenswrapper[4773]: E0120 19:31:21.447600 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:33 crc kubenswrapper[4773]: I0120 19:31:33.447560 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:33 crc kubenswrapper[4773]: E0120 19:31:33.448337 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:45 crc kubenswrapper[4773]: I0120 19:31:45.450962 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:45 crc kubenswrapper[4773]: E0120 19:31:45.452744 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:57 crc kubenswrapper[4773]: I0120 19:31:57.459187 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:57 crc kubenswrapper[4773]: E0120 19:31:57.460056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:11 crc kubenswrapper[4773]: I0120 19:32:11.451634 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:11 crc kubenswrapper[4773]: E0120 
19:32:11.453672 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:24 crc kubenswrapper[4773]: I0120 19:32:24.447494 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:24 crc kubenswrapper[4773]: E0120 19:32:24.448289 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:38 crc kubenswrapper[4773]: I0120 19:32:38.446804 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:38 crc kubenswrapper[4773]: E0120 19:32:38.447645 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:52 crc kubenswrapper[4773]: I0120 19:32:52.447428 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:52 crc 
kubenswrapper[4773]: E0120 19:32:52.448550 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:04 crc kubenswrapper[4773]: I0120 19:33:04.448600 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:04 crc kubenswrapper[4773]: E0120 19:33:04.450315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:15 crc kubenswrapper[4773]: I0120 19:33:15.452481 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:15 crc kubenswrapper[4773]: E0120 19:33:15.453170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:21 crc kubenswrapper[4773]: I0120 19:33:21.299407 4773 scope.go:117] "RemoveContainer" containerID="850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82" Jan 
20 19:33:30 crc kubenswrapper[4773]: I0120 19:33:30.449082 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:30 crc kubenswrapper[4773]: E0120 19:33:30.450543 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:44 crc kubenswrapper[4773]: I0120 19:33:44.448110 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:44 crc kubenswrapper[4773]: E0120 19:33:44.448853 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:59 crc kubenswrapper[4773]: I0120 19:33:59.448290 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:59 crc kubenswrapper[4773]: E0120 19:33:59.449240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:12 crc kubenswrapper[4773]: I0120 19:34:12.447426 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:12 crc kubenswrapper[4773]: E0120 19:34:12.448263 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:23 crc kubenswrapper[4773]: I0120 19:34:23.451725 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:23 crc kubenswrapper[4773]: E0120 19:34:23.452704 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:38 crc kubenswrapper[4773]: I0120 19:34:38.448614 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:38 crc kubenswrapper[4773]: E0120 19:34:38.449563 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:52 crc kubenswrapper[4773]: I0120 19:34:52.447331 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:52 crc kubenswrapper[4773]: E0120 19:34:52.448110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:03 crc kubenswrapper[4773]: I0120 19:35:03.447128 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:03 crc kubenswrapper[4773]: E0120 19:35:03.447912 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.576320 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:13 crc kubenswrapper[4773]: E0120 19:35:13.577223 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.577235 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" 
containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.578273 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.585531 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.622057 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.722793 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.723181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.723376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825898 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825978 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.843880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhxv\" (UniqueName: 
\"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.927552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:14 crc kubenswrapper[4773]: I0120 19:35:14.427823 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395109 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" exitCode=0 Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7"} Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerStarted","Data":"7411dc2283de4bdc0a3861974c303bfbfd255e4c35e2114ad37c24e4e22985e5"} Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.397398 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.943198 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.945395 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.972206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117627 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219463 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219652 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.220194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.249831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.284375 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.464156 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:17 crc kubenswrapper[4773]: E0120 19:35:17.465161 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.764719 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:17 crc kubenswrapper[4773]: W0120 19:35:17.776248 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0ac612_5a13_44eb_942e_52b7fa9a9c2f.slice/crio-4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac WatchSource:0}: Error finding container 4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac: Status 404 returned error can't find the container with id 4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.951048 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" exitCode=0 Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.951222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" 
event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326"} Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.959641 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerStarted","Data":"4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac"} Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.967514 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" exitCode=0 Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.967576 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b"} Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.971156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerStarted","Data":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} Jan 20 19:35:19 crc kubenswrapper[4773]: I0120 19:35:19.017740 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qp7pd" podStartSLOduration=2.983356832 podStartE2EDuration="6.017674644s" podCreationTimestamp="2026-01-20 19:35:13 +0000 UTC" firstStartedPulling="2026-01-20 19:35:15.396951585 +0000 UTC m=+3908.318764609" lastFinishedPulling="2026-01-20 19:35:18.431269397 +0000 UTC m=+3911.353082421" observedRunningTime="2026-01-20 19:35:19.012569622 +0000 UTC m=+3911.934382646" watchObservedRunningTime="2026-01-20 19:35:19.017674644 +0000 UTC 
m=+3911.939487688" Jan 20 19:35:20 crc kubenswrapper[4773]: I0120 19:35:20.989996 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" exitCode=0 Jan 20 19:35:20 crc kubenswrapper[4773]: I0120 19:35:20.990126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c"} Jan 20 19:35:22 crc kubenswrapper[4773]: I0120 19:35:22.000879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerStarted","Data":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} Jan 20 19:35:22 crc kubenswrapper[4773]: I0120 19:35:22.033411 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dp4ts" podStartSLOduration=3.5885728070000003 podStartE2EDuration="6.033387478s" podCreationTimestamp="2026-01-20 19:35:16 +0000 UTC" firstStartedPulling="2026-01-20 19:35:18.96939571 +0000 UTC m=+3911.891208734" lastFinishedPulling="2026-01-20 19:35:21.414210381 +0000 UTC m=+3914.336023405" observedRunningTime="2026-01-20 19:35:22.027795363 +0000 UTC m=+3914.949608387" watchObservedRunningTime="2026-01-20 19:35:22.033387478 +0000 UTC m=+3914.955200502" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.927689 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.929068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.978516 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:24 crc kubenswrapper[4773]: I0120 19:35:24.059317 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.284500 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.286123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.330426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.089955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.337969 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.338197 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qp7pd" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" containerID="cri-o://9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" gracePeriod=2 Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.649867 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.700049 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities" (OuterVolumeSpecName: "utilities") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.705311 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv" (OuterVolumeSpecName: "kube-api-access-fkhxv") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "kube-api-access-fkhxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.742539 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803006 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803485 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803498 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095122 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" exitCode=0 Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095221 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"7411dc2283de4bdc0a3861974c303bfbfd255e4c35e2114ad37c24e4e22985e5"} Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095244 4773 scope.go:117] "RemoveContainer" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095280 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.132369 4773 scope.go:117] "RemoveContainer" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.134686 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.142677 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.170973 4773 scope.go:117] "RemoveContainer" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.175256 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.175574 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dp4ts" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" containerID="cri-o://e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" gracePeriod=2 Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.215028 4773 scope.go:117] "RemoveContainer" 
containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.216866 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": container with ID starting with 9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229 not found: ID does not exist" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.216917 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} err="failed to get container status \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": rpc error: code = NotFound desc = could not find container \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": container with ID starting with 9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217017 4773 scope.go:117] "RemoveContainer" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.217888 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": container with ID starting with ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326 not found: ID does not exist" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217917 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326"} err="failed to get container status \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": rpc error: code = NotFound desc = could not find container \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": container with ID starting with ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217954 4773 scope.go:117] "RemoveContainer" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.220067 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": container with ID starting with dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7 not found: ID does not exist" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.220109 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7"} err="failed to get container status \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": rpc error: code = NotFound desc = could not find container \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": container with ID starting with dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.466348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" path="/var/lib/kubelet/pods/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d/volumes" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 
19:35:31.664361 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.738508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.738642 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.739541 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.740940 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities" (OuterVolumeSpecName: "utilities") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.743990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw" (OuterVolumeSpecName: "kube-api-access-48vvw") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "kube-api-access-48vvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.841581 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.841621 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.855008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.943226 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106605 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" exitCode=0 Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106682 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac"} Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106719 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.107008 4773 scope.go:117] "RemoveContainer" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.131514 4773 scope.go:117] "RemoveContainer" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.140023 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.148671 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.177006 4773 scope.go:117] "RemoveContainer" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.216966 4773 scope.go:117] "RemoveContainer" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.217487 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": container with ID starting with e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578 not found: ID does not exist" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217519 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} err="failed to get container status \"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": rpc error: code = NotFound desc = could not find container 
\"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": container with ID starting with e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578 not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217542 4773 scope.go:117] "RemoveContainer" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.217947 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": container with ID starting with 7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c not found: ID does not exist" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217995 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c"} err="failed to get container status \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": rpc error: code = NotFound desc = could not find container \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": container with ID starting with 7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.218021 4773 scope.go:117] "RemoveContainer" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.218481 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": container with ID starting with 0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b not found: ID does not exist" 
containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.218506 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b"} err="failed to get container status \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": rpc error: code = NotFound desc = could not find container \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": container with ID starting with 0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.447437 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:33 crc kubenswrapper[4773]: I0120 19:35:33.119977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"} Jan 20 19:35:33 crc kubenswrapper[4773]: I0120 19:35:33.457806 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" path="/var/lib/kubelet/pods/df0ac612-5a13-44eb-942e-52b7fa9a9c2f/volumes" Jan 20 19:35:35 crc kubenswrapper[4773]: E0120 19:35:35.470981 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:45582->38.102.83.39:34695: write tcp 38.102.83.39:45582->38.102.83.39:34695: write: connection reset by peer Jan 20 19:36:26 crc kubenswrapper[4773]: I0120 19:36:26.565380 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" exitCode=0 Jan 20 19:36:26 crc 
kubenswrapper[4773]: I0120 19:36:26.565428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerDied","Data":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"} Jan 20 19:36:26 crc kubenswrapper[4773]: I0120 19:36:26.568089 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.283159 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.283791 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.283860 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.283925 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284003 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284076 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284129 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284339 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284400 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284475 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284692 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284764 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.286138 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.297397 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.427994 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/gather/0.log" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.457332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.457867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.458343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.560542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod 
\"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.562181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.582584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"redhat-marketplace-9nkd6\" (UID: 
\"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.661371 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.167556 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584486 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5" exitCode=0 Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"} Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerStarted","Data":"09c45e26a48800d21ad8108a306a81ecbff994d1af4c00904479b9c9116d1432"} Jan 20 19:36:30 crc kubenswrapper[4773]: I0120 19:36:30.608697 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8" exitCode=0 Jan 20 19:36:30 crc kubenswrapper[4773]: I0120 19:36:30.608759 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"} Jan 20 19:36:31 crc kubenswrapper[4773]: I0120 19:36:31.652313 
4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerStarted","Data":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"} Jan 20 19:36:31 crc kubenswrapper[4773]: I0120 19:36:31.673892 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9nkd6" podStartSLOduration=2.060796981 podStartE2EDuration="4.673868518s" podCreationTimestamp="2026-01-20 19:36:27 +0000 UTC" firstStartedPulling="2026-01-20 19:36:28.586492778 +0000 UTC m=+3981.508305802" lastFinishedPulling="2026-01-20 19:36:31.199564315 +0000 UTC m=+3984.121377339" observedRunningTime="2026-01-20 19:36:31.668580481 +0000 UTC m=+3984.590393505" watchObservedRunningTime="2026-01-20 19:36:31.673868518 +0000 UTC m=+3984.595681542" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.046300 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.047429 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mzdmf/must-gather-lp22t" podUID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerName="copy" containerID="cri-o://1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" gracePeriod=2 Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.058039 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.497176 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/copy/0.log" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.497944 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.653615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") "
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.653673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") "
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.663182 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp" (OuterVolumeSpecName: "kube-api-access-khbtp") pod "ae725e5b-de4d-443b-bd8c-985abdcb0f87" (UID: "ae725e5b-de4d-443b-bd8c-985abdcb0f87"). InnerVolumeSpecName "kube-api-access-khbtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.690920 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/copy/0.log"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691670 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" exitCode=143
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691735 4773 scope.go:117] "RemoveContainer" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691878 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.712037 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.756451 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") on node \"crc\" DevicePath \"\""
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.804222 4773 scope.go:117] "RemoveContainer" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"
Jan 20 19:36:36 crc kubenswrapper[4773]: E0120 19:36:36.805617 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": container with ID starting with 1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95 not found: ID does not exist" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.805673 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"} err="failed to get container status \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": rpc error: code = NotFound desc = could not find container \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": container with ID starting with 1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95 not found: ID does not exist"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.805709 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"
Jan 20 19:36:36 crc kubenswrapper[4773]: E0120 19:36:36.806351 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": container with ID starting with 2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124 not found: ID does not exist" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.806380 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"} err="failed to get container status \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": rpc error: code = NotFound desc = could not find container \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": container with ID starting with 2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124 not found: ID does not exist"
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.858744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ae725e5b-de4d-443b-bd8c-985abdcb0f87" (UID: "ae725e5b-de4d-443b-bd8c-985abdcb0f87"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.960223 4773 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.457956 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" path="/var/lib/kubelet/pods/ae725e5b-de4d-443b-bd8c-985abdcb0f87/volumes"
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.662381 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.662447 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.709226 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.763417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.950847 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"]
Jan 20 19:36:39 crc kubenswrapper[4773]: I0120 19:36:39.716449 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9nkd6" podUID="0866c773-75be-4641-b2d6-74b9944abe6d" containerName="registry-server" containerID="cri-o://c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" gracePeriod=2
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.251529 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.319974 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") "
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.320301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") "
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.320337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") "
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.328869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities" (OuterVolumeSpecName: "utilities") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.332246 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq" (OuterVolumeSpecName: "kube-api-access-8dgwq") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "kube-api-access-8dgwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.360532 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423148 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423331 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423385 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") on node \"crc\" DevicePath \"\""
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727112 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" exitCode=0
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"}
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"09c45e26a48800d21ad8108a306a81ecbff994d1af4c00904479b9c9116d1432"}
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727211 4773 scope.go:117] "RemoveContainer" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727220 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.754072 4773 scope.go:117] "RemoveContainer" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.763283 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"]
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.771920 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"]
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.787195 4773 scope.go:117] "RemoveContainer" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.823630 4773 scope.go:117] "RemoveContainer" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"
Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.824328 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": container with ID starting with c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e not found: ID does not exist" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824375 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"} err="failed to get container status \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": rpc error: code = NotFound desc = could not find container \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": container with ID starting with c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e not found: ID does not exist"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824402 4773 scope.go:117] "RemoveContainer" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"
Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.824767 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": container with ID starting with 3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8 not found: ID does not exist" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824800 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"} err="failed to get container status \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": rpc error: code = NotFound desc = could not find container \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": container with ID starting with 3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8 not found: ID does not exist"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824821 4773 scope.go:117] "RemoveContainer" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"
Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.825339 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": container with ID starting with a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5 not found: ID does not exist" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"
Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.825379 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"} err="failed to get container status \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": rpc error: code = NotFound desc = could not find container \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": container with ID starting with a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5 not found: ID does not exist"
Jan 20 19:36:41 crc kubenswrapper[4773]: I0120 19:36:41.458254 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0866c773-75be-4641-b2d6-74b9944abe6d" path="/var/lib/kubelet/pods/0866c773-75be-4641-b2d6-74b9944abe6d/volumes"
Jan 20 19:37:58 crc kubenswrapper[4773]: I0120 19:37:58.169989 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:37:58 crc kubenswrapper[4773]: I0120 19:37:58.170547 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:38:28 crc kubenswrapper[4773]: I0120 19:38:28.170387 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:38:28 crc kubenswrapper[4773]: I0120 19:38:28.171104 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.170643 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171156 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7"
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171847 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171891 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162" gracePeriod=600
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.993685 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162" exitCode=0
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994004 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"}
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"73d06ef6f46d1d40d9ec469befd531111e1aaac92931f7dec4b5155df844c18a"}
Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994053 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"